Initial commit: Venus OS boat addons monorepo
Organizes 11 projects for Cerbo GX/Venus OS into a single repository:

- axiom-nmea: Raymarine LightHouse protocol decoder
- dbus-generator-ramp: Generator current ramp controller
- dbus-lightning: Blitzortung lightning monitor
- dbus-meteoblue-forecast: Meteoblue weather forecast
- dbus-no-foreign-land: noforeignland.com tracking
- dbus-tides: Tide prediction from depth + harmonics
- dbus-vrm-history: VRM cloud history proxy
- dbus-windy-station: Windy.com weather upload
- mfd-custom-app: MFD app deployment package
- venus-html5-app: Custom Victron HTML5 app fork
- watermaker: Watermaker PLC control UI

Adds root README, .gitignore, project template, and per-project .gitignore files. Sensitive config files are excluded via .gitignore, with .example templates provided.

Made-with: Cursor
.gitignore (new file, 53 lines, vendored)
@@ -0,0 +1,53 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg
*.egg-info/
venv/
.venv/

# Node
node_modules/
dist/
build/
*.tgz
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Venus OS runtime (created by install.sh on device)
ext/velib_python

# Build artifacts
*.tar.gz
*.sha256

# IDE
.idea/
.vscode/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# Environment / secrets
.env.local
.env.development.local
.env.test.local
.env.production.local

# Sensitive config files (use *.example.json templates instead)
dbus-meteoblue-forecast/forecast_config.json
dbus-windy-station/station_config.json

# Databases
*.db

# Design reference assets (kept locally, not in repo)
inspiration assets/
README.md (new file, 79 lines)
@@ -0,0 +1,79 @@
# Venus OS Boat Addons

Custom addons for a Cerbo GX running [Venus OS](https://github.com/victronenergy/venus) on a sailboat. These services extend the system with weather monitoring, tide prediction, navigation tracking, generator management, and custom MFD displays.

All D-Bus services follow the same deployment pattern: build a `.tar.gz` package, copy it to `/data/` on the Cerbo GX, and run `install.sh`. Services are managed by daemontools and survive firmware updates via `rc.local`.

## Projects

| Project | Type | Language | Description |
|---------|------|----------|-------------|
| [axiom-nmea](axiom-nmea/) | Library + Service | Python | Decodes Raymarine LightHouse protobuf multicast into NMEA 0183 sentences and Venus OS D-Bus services. Includes protocol documentation, debug tools, and a deployable D-Bus publisher. |
| [dbus-generator-ramp](dbus-generator-ramp/) | D-Bus Service | Python | Gradually ramps inverter/charger input current when running on generator to avoid overload. Features warm-up hold, overload detection with fast recovery, and a persistent power-correlation learning model. |
| [dbus-lightning](dbus-lightning/) | D-Bus Service | Python | Monitors real-time lightning strikes from the Blitzortung network via WebSocket. Filters by distance, analyzes storm approach speed, and estimates ETA. |
| [dbus-meteoblue-forecast](dbus-meteoblue-forecast/) | D-Bus Service | Python | Fetches 7-day weather forecasts from the Meteoblue API (wind, waves, precipitation, temperature). Adjusts refresh rate based on boat movement. |
| [dbus-no-foreign-land](dbus-no-foreign-land/) | D-Bus Service | Python | Sends GPS position and track data to noforeignland.com. Includes a QML settings page for the Venus OS GUI. |
| [dbus-tides](dbus-tides/) | D-Bus Service | Python | Predicts tides by combining depth sensor readings with harmonic tidal models (NOAA stations and coastal grid). Records depth history in SQLite, detects high/low tides, and calibrates to chart depth. |
| [dbus-vrm-history](dbus-vrm-history/) | D-Bus Service | Python | Proxies historical data from the VRM cloud API and exposes it on D-Bus/MQTT for the frontend dashboard. |
| [dbus-windy-station](dbus-windy-station/) | D-Bus Service | Python | Uploads weather observations from Raymarine sensors to the Windy.com Stations API. Supports both legacy and v2 Venus OS GUI plugins. |
| [mfd-custom-app](mfd-custom-app/) | Deployment Package | Shell | Builds and deploys the custom HTML5 app to the Cerbo GX. Overrides the stock Victron app with custom pages served via nginx. Supports SSH and USB installation. |
| [venus-html5-app](venus-html5-app/) | Frontend App | TypeScript/React | Fork of the Victron Venus HTML5 app with custom Marine2 views for weather, tides, tracking, generator status, and mooring. Displayed on Raymarine Axiom and other MFDs. |
| [watermaker](watermaker/) | UI + API Docs | React/JS | Control interface for a watermaker PLC system. React SPA with REST/WebSocket/MQTT integration. Backend runs on a separate PLC controller. |

## Common D-Bus Service Structure

All Python D-Bus services share this layout:

```
dbus-<name>/
├── <name>.py          # Main service entry point
├── config.py          # Configuration constants
├── service/
│   ├── run            # daemontools entry point
│   └── log/run        # multilog configuration
├── install.sh         # Venus OS installation
├── uninstall.sh       # Cleanup
├── build-package.sh   # Creates deployable .tar.gz
└── README.md
```

At install time, `install.sh` symlinks `velib_python` from `/opt/victronenergy/`, registers the service with daemontools, and adds an `rc.local` entry for persistence across firmware updates.

## Deployment

```bash
# Build a package
cd dbus-<name>
./build-package.sh

# Copy to Cerbo GX
scp dbus-<name>-*.tar.gz root@<cerbo-ip>:/data/

# Install on device
ssh root@<cerbo-ip>
cd /data && tar -xzf dbus-<name>-*.tar.gz
bash /data/dbus-<name>/install.sh

# Check service status
svstat /service/dbus-<name>
tail -f /var/log/dbus-<name>/current | tai64nlocal
```

## Development Prerequisites

- **Python 3.8+** -- for all D-Bus services
- **Node.js 20+** and **npm** -- for venus-html5-app and the watermaker UI
- **SSH/root access** to the Cerbo GX
- **Venus OS v3.10+** on the target device

## Sensitive Configuration

Some projects require API keys or credentials. These are excluded from version control. Copy the example templates and fill in your values:

- `dbus-meteoblue-forecast/forecast_config.example.json` --> `forecast_config.json`
- `dbus-windy-station/station_config.example.json` --> `station_config.json`
- `watermaker/ui/.env.example` --> `.env`

## New Project Template

See [`_template/`](_template/) for a skeleton D-Bus service with all the boilerplate: main script, config, daemontools service, install/uninstall scripts, and build packaging.
_template/.gitignore (new file, 24 lines, vendored)
@@ -0,0 +1,24 @@
# Build artifacts
*.tar.gz
*.sha256

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg
*.egg-info/
dist/
build/

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# Venus OS runtime (created during installation)
ext/
_template/README.md (new file, 44 lines)
@@ -0,0 +1,44 @@
# dbus-template

Venus OS D-Bus service template. Use this as a starting point for new services.

## Getting Started

1. Copy this directory: `cp -r _template/ dbus-your-service-name/`
2. Rename `dbus-template.py` to match your service (e.g. `your_service.py`)
3. Update `config.py` with your service name, product ID, and settings
4. Update `service/run` to point to your renamed main script
5. Update `install.sh`: set `SERVICE_NAME` and `MAIN_SCRIPT` variables
6. Update `build-package.sh`: set `PACKAGE_NAME` and the file copy list
7. Replace this README with your own documentation

## Files

| File | Purpose |
|------|---------|
| `dbus-template.py` | Main service with D-Bus registration boilerplate |
| `config.py` | Configuration constants (service name, product ID, timing) |
| `service/run` | daemontools entry point |
| `service/log/run` | multilog configuration |
| `install.sh` | Venus OS installation (velib symlink, service registration, rc.local) |
| `uninstall.sh` | Removes service symlink |
| `build-package.sh` | Creates a deployable .tar.gz package |

## D-Bus Paths

Update these for your service:

| Path | Type | Description |
|------|------|-------------|
| `/ProductName` | string | Service display name |
| `/Connected` | int | Connection status (0/1) |
| `/Settings/Template/Enabled` | int | Enable/disable via settings |

## Checklist

- [ ] Unique service name in `config.py` (`com.victronenergy.yourservice`)
- [ ] Unique product ID in `config.py` (check existing services to avoid conflicts)
- [ ] All file references updated in `service/run`, `install.sh`, `build-package.sh`
- [ ] Custom D-Bus paths added
- [ ] Settings paths updated
- [ ] README replaced with your own documentation
_template/build-package.sh (new executable file, 68 lines)
@@ -0,0 +1,68 @@
#!/bin/bash
#
# Build script for Venus OS D-Bus service package
#
# Usage:
#   ./build-package.sh
#   ./build-package.sh --version 1.0.0
#

set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

VERSION="0.1.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-template"

while [[ $# -gt 0 ]]; do
    case $1 in
        --version|-v) VERSION="$2"; shift 2 ;;
        --output|-o) OUTPUT_DIR="$2"; shift 2 ;;
        --help|-h)
            echo "Usage: $0 [--version VERSION] [--output PATH]"
            exit 0
            ;;
        *) echo "Unknown option: $1"; exit 1 ;;
    esac
done

BUILD_DIR=$(mktemp -d)
PACKAGE_DIR="$BUILD_DIR/$PACKAGE_NAME"

echo "Building $PACKAGE_NAME v$VERSION..."

mkdir -p "$PACKAGE_DIR/service/log"

# Copy application files -- update this list for your service
cp "$SCRIPT_DIR/dbus-template.py" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/config.py" "$PACKAGE_DIR/"

# Copy service and install files
cp "$SCRIPT_DIR/service/run" "$PACKAGE_DIR/service/"
cp "$SCRIPT_DIR/service/log/run" "$PACKAGE_DIR/service/log/"
cp "$SCRIPT_DIR/install.sh" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/uninstall.sh" "$PACKAGE_DIR/"

# Set permissions
chmod +x "$PACKAGE_DIR/dbus-template.py"
chmod +x "$PACKAGE_DIR/install.sh"
chmod +x "$PACKAGE_DIR/uninstall.sh"
chmod +x "$PACKAGE_DIR/service/run"
chmod +x "$PACKAGE_DIR/service/log/run"

# Create archive
mkdir -p "$OUTPUT_DIR"
TARBALL="$PACKAGE_NAME-$VERSION.tar.gz"
OUTPUT_ABS="$(cd "$OUTPUT_DIR" && pwd)"
cd "$BUILD_DIR"
tar --format=ustar -czf "$OUTPUT_ABS/$TARBALL" "$PACKAGE_NAME"
rm -rf "$BUILD_DIR"

echo "Package: $OUTPUT_ABS/$TARBALL"
echo ""
echo "Install on Venus OS:"
echo "  scp $OUTPUT_ABS/$TARBALL root@<device-ip>:/data/"
echo "  ssh root@<device-ip>"
echo "  cd /data && tar -xzf $TARBALL"
echo "  bash /data/$PACKAGE_NAME/install.sh"
_template/config.py (new file, 22 lines)
@@ -0,0 +1,22 @@
"""
Configuration for dbus-YOUR-SERVICE-NAME.

Renaming this file is not needed -- just update the values below.
"""

# Service identity
SERVICE_NAME = 'com.victronenergy.yourservice'
DEVICE_INSTANCE = 0
PRODUCT_NAME = 'Your Service Name'
PRODUCT_ID = 0xA1FF  # Pick a unique product ID (0xA100-0xA1FF range)
FIRMWARE_VERSION = 0
CONNECTED = 1

# Version
VERSION = '0.1.0'

# Timing
MAIN_LOOP_INTERVAL_MS = 1000  # Main loop tick (milliseconds)

# Logging
LOG_LEVEL = 'INFO'  # DEBUG, INFO, WARNING, ERROR
_template/dbus-template.py (new executable file, 131 lines)
@@ -0,0 +1,131 @@
#!/usr/bin/env python3
"""
Venus OS D-Bus service template.

To create a new service:
1. Copy the _template/ directory and rename it to dbus-<your-service-name>/
2. Rename this file to match your service (e.g. your_service.py)
3. Update config.py with your service name, product ID, etc.
4. Update service/run to point to your renamed script
5. Update install.sh with your service name
6. Update build-package.sh with your file list
"""

import logging
import os
import signal
import sys
import time

sys.path.insert(1, os.path.join(os.path.dirname(__file__), 'ext', 'velib_python'))

from vedbus import VeDbusService  # noqa: E402
from settingsdevice import SettingsDevice  # noqa: E402
import dbus  # noqa: E402
from gi.repository import GLib  # noqa: E402

from config import (  # noqa: E402
    SERVICE_NAME,
    DEVICE_INSTANCE,
    PRODUCT_NAME,
    PRODUCT_ID,
    FIRMWARE_VERSION,
    CONNECTED,
    MAIN_LOOP_INTERVAL_MS,
    LOG_LEVEL,
    VERSION,
)

logger = logging.getLogger('dbus-template')


class TemplateService:
    """Main service class. Rename to match your service."""

    def __init__(self):
        self._running = False
        self._dbusservice = None
        self._settings = None

    def start(self):
        """Initialize D-Bus service and start the main loop."""
        bus = dbus.SystemBus()

        self._dbusservice = VeDbusService(
            SERVICE_NAME, bus=bus, register=False
        )

        # Mandatory D-Bus paths
        self._dbusservice.add_path('/Mgmt/ProcessName', __file__)
        self._dbusservice.add_path('/Mgmt/ProcessVersion', VERSION)
        self._dbusservice.add_path('/Mgmt/Connection', 'local')
        self._dbusservice.add_path('/DeviceInstance', DEVICE_INSTANCE)
        self._dbusservice.add_path('/ProductId', PRODUCT_ID)
        self._dbusservice.add_path('/ProductName', PRODUCT_NAME)
        self._dbusservice.add_path('/FirmwareVersion', FIRMWARE_VERSION)
        self._dbusservice.add_path('/Connected', CONNECTED)

        # --- Add your custom D-Bus paths here ---
        # self._dbusservice.add_path('/YourPath', initial_value)

        # Settings (stored in Venus OS localsettings, persist across reboots)
        settings_path = '/Settings/Template'
        supported_settings = {
            'enabled': [settings_path + '/Enabled', 1, 0, 1],
            # 'your_setting': [settings_path + '/YourSetting', default, min, max],
        }
        self._settings = SettingsDevice(
            bus, supported_settings, self._on_setting_changed
        )

        self._dbusservice.register()
        logger.info('Service registered on D-Bus as %s', SERVICE_NAME)

        self._running = True
        GLib.timeout_add(MAIN_LOOP_INTERVAL_MS, self._update)

    def _update(self):
        """Called every MAIN_LOOP_INTERVAL_MS. Return True to keep running."""
        if not self._running:
            return False

        # --- Add your main loop logic here ---

        return True

    def _on_setting_changed(self, setting, old, new):
        """Called when a Venus OS setting changes."""
        logger.info('Setting %s changed: %s -> %s', setting, old, new)

    def stop(self):
        """Clean shutdown."""
        self._running = False
        logger.info('Service stopped')


def main():
    logging.basicConfig(
        level=getattr(logging, LOG_LEVEL, logging.INFO),
        format='%(asctime)s %(name)s %(levelname)s %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S',
    )
    logger.info('Starting dbus-template v%s', VERSION)

    service = TemplateService()

    def shutdown(signum, frame):
        logger.info('Received signal %d, shutting down...', signum)
        service.stop()
        sys.exit(0)

    signal.signal(signal.SIGTERM, shutdown)
    signal.signal(signal.SIGINT, shutdown)

    service.start()

    mainloop = GLib.MainLoop()
    mainloop.run()


if __name__ == '__main__':
    main()
_template/install.sh (new executable file, 115 lines)
@@ -0,0 +1,115 @@
#!/bin/bash
#
# Installation script for Venus OS D-Bus service template
#
# Usage:
#   chmod +x install.sh
#   ./install.sh
#

set -e

SERVICE_NAME="dbus-template"
INSTALL_DIR="/data/$SERVICE_NAME"
MAIN_SCRIPT="dbus-template.py"

# Find velib_python
VELIB_DIR=""
if [ -d "/opt/victronenergy/velib_python" ]; then
    VELIB_DIR="/opt/victronenergy/velib_python"
else
    for candidate in \
        "/opt/victronenergy/dbus-systemcalc-py/ext/velib_python" \
        "/opt/victronenergy/dbus-generator/ext/velib_python" \
        "/opt/victronenergy/dbus-mqtt/ext/velib_python" \
        "/opt/victronenergy/dbus-digitalinputs/ext/velib_python" \
        "/opt/victronenergy/vrmlogger/ext/velib_python"
    do
        if [ -d "$candidate" ] && [ -f "$candidate/vedbus.py" ]; then
            VELIB_DIR="$candidate"
            break
        fi
    done
fi

if [ -z "$VELIB_DIR" ]; then
    VEDBUS_PATH=$(find /opt/victronenergy -name "vedbus.py" -path "*/velib_python/*" 2>/dev/null | head -1)
    if [ -n "$VEDBUS_PATH" ]; then
        VELIB_DIR=$(dirname "$VEDBUS_PATH")
    fi
fi

# Determine service directory
if [ -d "/service" ] && [ ! -L "/service" ]; then
    SERVICE_DIR="/service"
elif [ -d "/opt/victronenergy/service" ]; then
    SERVICE_DIR="/opt/victronenergy/service"
elif [ -L "/service" ]; then
    SERVICE_DIR=$(readlink -f /service)
else
    SERVICE_DIR="/opt/victronenergy/service"
fi

echo "=================================================="
echo "$SERVICE_NAME - Installation"
echo "=================================================="

if [ ! -d "$SERVICE_DIR" ]; then
    echo "ERROR: This doesn't appear to be a Venus OS device."
    exit 1
fi

if [ ! -f "$INSTALL_DIR/$MAIN_SCRIPT" ]; then
    echo "ERROR: Installation files not found in $INSTALL_DIR"
    exit 1
fi

echo "1. Making scripts executable..."
chmod +x "$INSTALL_DIR/service/run"
chmod +x "$INSTALL_DIR/service/log/run"
chmod +x "$INSTALL_DIR/$MAIN_SCRIPT"

echo "2. Creating velib_python symlink..."
if [ -z "$VELIB_DIR" ]; then
    echo "ERROR: Could not find velib_python on this system."
    exit 1
fi
echo "   Found velib_python at: $VELIB_DIR"
mkdir -p "$INSTALL_DIR/ext"
if [ -L "$INSTALL_DIR/ext/velib_python" ]; then
    rm "$INSTALL_DIR/ext/velib_python"
fi
ln -s "$VELIB_DIR" "$INSTALL_DIR/ext/velib_python"

echo "3. Creating service symlink..."
if [ -L "$SERVICE_DIR/$SERVICE_NAME" ] || [ -e "$SERVICE_DIR/$SERVICE_NAME" ]; then
    rm -rf "$SERVICE_DIR/$SERVICE_NAME"
fi
ln -s "$INSTALL_DIR/service" "$SERVICE_DIR/$SERVICE_NAME"

echo "4. Creating log directory..."
mkdir -p "/var/log/$SERVICE_NAME"

echo "5. Setting up rc.local for persistence..."
RC_LOCAL="/data/rc.local"
if [ ! -f "$RC_LOCAL" ]; then
    echo "#!/bin/bash" > "$RC_LOCAL"
    chmod +x "$RC_LOCAL"
fi

if ! grep -q "$SERVICE_NAME" "$RC_LOCAL"; then
    echo "" >> "$RC_LOCAL"
    echo "# $SERVICE_NAME" >> "$RC_LOCAL"
    echo "if [ ! -L $SERVICE_DIR/$SERVICE_NAME ]; then" >> "$RC_LOCAL"
    echo "    ln -s /data/$SERVICE_NAME/service $SERVICE_DIR/$SERVICE_NAME" >> "$RC_LOCAL"
    echo "fi" >> "$RC_LOCAL"
fi

echo ""
echo "=================================================="
echo "Installation complete!"
echo "=================================================="
echo ""
echo "To check status: svstat $SERVICE_DIR/$SERVICE_NAME"
echo "To view logs:    tail -F /var/log/$SERVICE_NAME/current | tai64nlocal"
echo ""
_template/service/log/run (new executable file, 2 lines)
@@ -0,0 +1,2 @@
#!/bin/sh
exec multilog t s99999 n8 /var/log/dbus-template
_template/service/run (new executable file, 5 lines)
@@ -0,0 +1,5 @@
#!/bin/sh
exec 2>&1
cd /data/dbus-template
export PYTHONPATH="/data/dbus-template/ext/velib_python:$PYTHONPATH"
exec python3 /data/dbus-template/dbus-template.py
_template/uninstall.sh (new executable file, 32 lines)
@@ -0,0 +1,32 @@
#!/bin/bash
#
# Uninstall script for Venus OS D-Bus service template
#

set -e

SERVICE_NAME="dbus-template"

if [ -d "/service" ] && [ ! -L "/service" ]; then
    SERVICE_DIR="/service"
elif [ -d "/opt/victronenergy/service" ]; then
    SERVICE_DIR="/opt/victronenergy/service"
elif [ -L "/service" ]; then
    SERVICE_DIR=$(readlink -f /service)
else
    SERVICE_DIR="/opt/victronenergy/service"
fi

echo "Uninstalling $SERVICE_NAME..."

if [ -L "$SERVICE_DIR/$SERVICE_NAME" ]; then
    svc -d "$SERVICE_DIR/$SERVICE_NAME" 2>/dev/null || true
    sleep 2
    rm "$SERVICE_DIR/$SERVICE_NAME"
    echo "Service symlink removed"
fi

echo ""
echo "Uninstall complete."
echo "Files remain in /data/$SERVICE_NAME/ -- remove manually if desired."
echo ""
axiom-nmea/.gitignore (new file, 24 lines, vendored)
@@ -0,0 +1,24 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
dist/
build/

# IDE
.idea/
.vscode/
*.swp
*.swo

# Packet captures (keep samples/ dir but ignore pcap files within)
captures/
samples/*.pcap
*.bin

# Environment
.env
venv/
axiom-nmea/PROTOCOL.md (new file, 624 lines)
@@ -0,0 +1,624 @@
# Raymarine LightHouse Protocol Analysis

## Overview

This document describes the findings from reverse-engineering the Raymarine LightHouse network protocol used by AXIOM MFDs to share sensor data over IP multicast.

**Key Discovery**: Raymarine does NOT use standard NMEA 0183 text sentences on its multicast network. Instead, it uses **Google Protocol Buffers** (protobuf) binary encoding over UDP multicast.

## Quick Reference

### Decoding Status Summary

| Sensor | Status | Field | Unit |
|--------|--------|-------|------|
| GPS Position | ✅ Reliable | 2.1, 2.2 | Decimal degrees |
| **SOG (Speed Over Ground)** | ✅ Reliable | 5.5 | m/s → knots |
| **COG (Course Over Ground)** | ✅ Reliable | 5.1 | Radians → degrees |
| Compass Heading | ⚠️ Variable | 3.2 | Radians → degrees |
| Wind Direction | ⚠️ Variable | 13.4 | Radians → degrees |
| Wind Speed | ⚠️ Variable | 13.5, 13.6 | m/s → knots |
| Depth | ⚠️ Variable | 7.1 | Meters → feet |
| Barometric Pressure | ✅ Reliable | 15.1 | Pascals → mbar |
| Water Temperature | ✅ Reliable | 15.9 | Kelvin → Celsius |
| Air Temperature | ⚠️ Variable | 15.3 | Kelvin → Celsius |
| Tank Levels | ✅ Reliable | 16 | Percentage (0-100%) |
| House Batteries | ✅ Reliable | 20 | Volts (direct) |
| Engine Batteries | ✅ Reliable | 14.3.4 | Volts (direct) |

### Primary Data Source

- **Multicast:** `226.192.206.102:2565`
- **Source IP:** `198.18.1.170` (AXIOM 12 Data Master)
- **Packet Format:** 20-byte header + Protocol Buffers payload
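Capturing the primary feed is a plain UDP multicast subscription. A minimal sketch (group and port from this document; the interface choice and receive buffer size are assumptions):

```python
import socket

GROUP, PORT = "226.192.206.102", 2565  # primary sensor feed (see above)

def make_membership_request(group: str, interface: str = "0.0.0.0") -> bytes:
    """Build the 8-byte ip_mreq structure used by IP_ADD_MEMBERSHIP."""
    return socket.inet_aton(group) + socket.inet_aton(interface)

def open_multicast_socket(group: str = GROUP, port: int = PORT) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # receive datagrams addressed to the group port
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request(group))
    return sock

if __name__ == "__main__":
    sock = open_multicast_socket()
    while True:
        packet, addr = sock.recvfrom(4096)
        print(f"{addr[0]}: {len(packet)}-byte packet")
```

Run on a host attached to the MFD network; packet sizes should match the size table in the Network Configuration section.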
---

## Network Configuration

### Multicast Groups

| Group | Port | Source IP | Device | Purpose |
|-------|------|-----------|--------|---------|
| 226.192.206.98 | 2561 | 10.22.6.115 | Unknown | Navigation (mostly zeros) |
| 226.192.206.99 | 2562 | 198.18.1.170 | AXIOM 12 Data Master | Heartbeat/status |
| 226.192.206.102 | 2565 | 198.18.1.170 | AXIOM 12 Data Master | **Primary sensor data** |
| 226.192.219.0 | 3221 | 198.18.2.191 | AXIOM PLUS 12 RV | Display synchronization |

Additional groups that may contain sensor/tank data:

- `226.192.206.100:2563`
- `226.192.206.101:2564`
- `239.2.1.1:2154`

### Data Sources

| IP Address | Ports | Device | Data Types |
|------------|-------|--------|------------|
| 198.18.1.170 | 35044, 41741 | AXIOM 12 (Data Master) | GPS, Wind, Depth, Heading, Temp, Tanks, Batteries |
| 198.18.2.191 | 35022, 45403, 50194 | AXIOM PLUS 12 RV | Display sync, possible depth relay |
| 10.22.6.115 | 57601 | Unknown | Mostly zero values |

### Packet Sizes

The data master (198.18.1.170) sends packets of varying sizes:

| Size (bytes) | Frequency | Contents |
|--------------|-----------|----------|
| 16 | Low | Minimal/heartbeat |
| 54 | Low | Short messages |
| 91-92 | Medium | Status/heartbeat |
| 344 | Medium | Partial sensor data |
| 446 | Medium | Sensor data |
| 788-903 | Medium | Extended sensor data |
| 1003 | Medium | Extended sensor data |
| 1810-2056 | High | Full navigation data including GPS |

---

## Packet Structure

### Fixed Header (20 bytes)

All packets begin with a 20-byte fixed header before the protobuf payload:

```
Offset   Size  Description
------   ----  -----------
0x0000   8     Packet identifier (00 00 00 00 00 00 00 01)
0x0008   4     Source ID
0x000C   4     Message type indicator
0x0010   4     Payload length
```

**Protobuf payload starts at offset 0x14 (20 decimal).**
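Splitting header from payload can be sketched as follows. The 20-byte layout comes from the table above; treating the three 4-byte fields as big-endian integers is an unverified assumption:

```python
import struct

HEADER_LEN = 20  # fixed header size documented above

def split_packet(packet: bytes):
    """Split a LightHouse packet into header fields and protobuf payload.

    Big-endian byte order for the three 4-byte integers is an
    assumption; only the offsets are documented.
    """
    if len(packet) < HEADER_LEN:
        raise ValueError("packet shorter than 20-byte header")
    identifier = packet[0:8]
    source_id, msg_type, payload_len = struct.unpack(">III", packet[8:20])
    return identifier, source_id, msg_type, packet[HEADER_LEN:]

# Example with a synthetic packet: 20-byte header + 3 payload bytes
pkt = (bytes.fromhex("0000000000000001")
       + struct.pack(">III", 7, 2, 3)
       + b"\x0a\x01\x41")
ident, src, mtype, payload = split_packet(pkt)
```

Everything after offset 0x14 is then fed to a protobuf wire-format decoder.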
### Protobuf Message Structure

The payload uses Google Protocol Buffers wire format. Top-level fields:

```
Field 1 (length)   - Device Info (name, serial number)

Field 2 (length)   - GPS/Position Data
  ├─ Field 1 (fixed64/double) - Latitude
  └─ Field 2 (fixed64/double) - Longitude

Field 3 (length)   - Heading Block
  └─ Field 2 (fixed32/float) - Heading (radians)

Field 5 (length)   - SOG/COG Navigation Data (86-92 byte packets)
  ├─ Field 1 (fixed32/float) - COG Course Over Ground (radians)
  ├─ Field 3 (fixed32/float) - Unknown constant (0.05)
  ├─ Field 4 (fixed32/float) - Unknown constant (0.1)
  ├─ Field 5 (fixed32/float) - SOG Speed Over Ground (m/s)
  ├─ Field 6 (fixed32/float) - Secondary angle (radians) - possibly heading
  └─ Field 7 (fixed32/float) - Unknown constant (11.93)

Field 7 (length)   - Depth Block (large packets only)
  └─ Field 1 (fixed32/float) - Depth (meters)

Field 13 (length)  - Wind/Navigation Data
  ├─ Field 4 (fixed32/float) - True Wind Direction (radians)
  ├─ Field 5 (fixed32/float) - True Wind Speed (m/s)
  └─ Field 6 (fixed32/float) - Apparent Wind Speed (m/s)

Field 14 (repeated) - Engine Data
  ├─ Field 1 (varint) - Engine ID (0=Port, 1=Starboard)
  └─ Field 3 (length) - Engine Sensor Data
       └─ Field 4 (fixed32/float) - Battery Voltage (volts)

Field 15 (length)  - Environment Data
  ├─ Field 1 (fixed32/float) - Barometric Pressure (Pascals)
  ├─ Field 3 (fixed32/float) - Air Temperature (Kelvin)
  └─ Field 9 (fixed32/float) - Water Temperature (Kelvin)

Field 16 (repeated) - Tank Data
  ├─ Field 1 (varint) - Tank ID
  ├─ Field 2 (varint) - Status/Flag
  └─ Field 3 (fixed32/float) - Tank Level (percentage)

Field 20 (repeated) - House Battery Data
  ├─ Field 1 (varint) - Battery ID (11=Aft, 13=Stern)
  └─ Field 3 (fixed32/float) - Voltage (volts)
```

---

## Wire Format Details

Raymarine uses Protocol Buffers with these wire types:

| Wire Type | Name | Size | Usage |
|-----------|------|------|-------|
| 0 | Varint | Variable | IDs, counts, enums, status flags |
| 1 | Fixed64 | 8 bytes | High-precision values (GPS coordinates) |
| 2 | Length-delimited | Variable | Nested messages, byte strings |
| 5 | Fixed32 | 4 bytes | Floats (angles, speeds, voltages) |

### Tag Format

Each field is prefixed by a tag byte: `(field_number << 3) | wire_type`

Examples:

- `0x09` = Field 1, wire type 1 (fixed64)
- `0x11` = Field 2, wire type 1 (fixed64)
- `0x15` = Field 2, wire type 5 (fixed32)
- `0x1d` = Field 3, wire type 5 (fixed32)
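The tag rule above generalizes to a small field walker covering all four wire types. This is a sketch of standard protobuf wire decoding (not Raymarine-specific); interpreting fixed64/fixed32 as little-endian double/float matches the usage seen in these packets:

```python
import struct

def read_varint(buf: bytes, pos: int):
    """Decode a base-128 varint, returning (value, next_position)."""
    result = shift = 0
    while True:
        b = buf[pos]
        result |= (b & 0x7F) << shift
        pos += 1
        if not (b & 0x80):
            return result, pos
        shift += 7

def iter_fields(buf: bytes):
    """Yield (field_number, wire_type, value) for each top-level field."""
    pos = 0
    while pos < len(buf):
        tag, pos = read_varint(buf, pos)
        field, wire = tag >> 3, tag & 0x07
        if wire == 0:    # varint
            value, pos = read_varint(buf, pos)
        elif wire == 1:  # fixed64 (a double in these packets)
            value = struct.unpack_from("<d", buf, pos)[0]; pos += 8
        elif wire == 2:  # length-delimited (nested message / bytes)
            length, pos = read_varint(buf, pos)
            value = buf[pos:pos + length]; pos += length
        elif wire == 5:  # fixed32 (a float in these packets)
            value = struct.unpack_from("<f", buf, pos)[0]; pos += 4
        else:
            raise ValueError(f"unsupported wire type {wire}")
        yield field, wire, value
```

For example, `list(iter_fields(bytes.fromhex("089601")))` yields `[(1, 0, 150)]`, the canonical protobuf varint example. Length-delimited values can be fed back into `iter_fields` to walk nested blocks such as Field 2 (GPS) or Field 5 (SOG/COG).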
### Unit Conventions

| Measurement | Raw Unit | Conversion |
|-------------|----------|------------|
| Latitude/Longitude | Decimal degrees | Direct (WGS84) |
| Angles (heading, wind) | Radians | × 57.2957795131 = degrees |
| Wind speed | m/s | × 1.94384449 = knots |
| Depth | Meters | ÷ 0.3048 = feet |
| Temperature | Kelvin | − 273.15 = Celsius |
| Barometric Pressure | Pascals | × 0.01 = mbar (hPa) |
| Tank levels | Percentage | 0-100% direct |
| Voltage | Volts | Direct value |
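The conversion column maps directly onto a few helpers (constants from the table; function names are illustrative):

```python
RAD_TO_DEG = 57.2957795131   # conversion factors from the table above
MS_TO_KNOTS = 1.94384449
M_TO_FEET = 1 / 0.3048

def radians_to_degrees(rad: float) -> float:
    """Angles (heading, wind): radians to 0-360 degrees."""
    return (rad * RAD_TO_DEG) % 360

def ms_to_knots(ms: float) -> float:
    return ms * MS_TO_KNOTS

def meters_to_feet(m: float) -> float:
    return m * M_TO_FEET

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def pascals_to_mbar(pa: float) -> float:
    return pa * 0.01
```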
---

## Field Extraction Methods

### GPS Position ✅ RELIABLE

**Location:** Field 2.1 (latitude), Field 2.2 (longitude)

```
# Parse Field 2 as a nested message, then extract:
Field 2.1 (fixed64/double) → Latitude in decimal degrees
Field 2.2 (fixed64/double) → Longitude in decimal degrees

# Validation
-90 ≤ latitude ≤ 90
-180 ≤ longitude ≤ 180
abs(lat) > 0.1 or abs(lon) > 0.1   # Not at null island
```

**Example decode:**

```
Hex: 09 cf 20 f4 22 c9 ee 38 40 11 b4 6f 93 f6 2b 28 54 c0
     |  |                       |  |
     |  +-- Latitude double     |  +-- Longitude double
     +-- Field 1 tag            +-- Field 2 tag

Latitude:  24.932757° N
Longitude: -80.627683° W
```
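The example bytes decode directly with Python's `struct`, using the tag values and little-endian doubles from the wire-format section:

```python
import struct

EXAMPLE_HEX = "09cf20f422c9ee384011b46f93f62b2854c0"  # Field 2 body from above

def decode_position(field2: bytes):
    """Decode a Field 2 (GPS) nested message: tag 0x09 + lat, tag 0x11 + lon."""
    assert field2[0] == 0x09, "expected field 1, wire type 1 (fixed64)"
    lat = struct.unpack_from("<d", field2, 1)[0]
    assert field2[9] == 0x11, "expected field 2, wire type 1 (fixed64)"
    lon = struct.unpack_from("<d", field2, 10)[0]
    # Validation rules from above
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        raise ValueError("out-of-range coordinates")
    return lat, lon

lat, lon = decode_position(bytes.fromhex(EXAMPLE_HEX))
print(f"{lat:.6f}, {lon:.6f}")  # ≈ 24.932757, -80.627683
```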
|
||||
---

### SOG (Speed Over Ground) ✅ RELIABLE

**Location:** Field 5.5

```python
Field 5.5 (fixed32/float) → SOG in meters per second

# Conversion
sog_knots = sog_ms × 1.94384449

# Validation
0 ≤ sog ≤ 50 (m/s, roughly 0-100 knots)
```

**Notes:**

- Found in 86-92 byte packets
- At dock, the value is near zero (~0.01 m/s = 0.02 kts)
- Derived from GPS, so requires GPS lock
---

### COG (Course Over Ground) ✅ RELIABLE

**Location:** Field 5.1

```python
Field 5.1 (fixed32/float) → COG in radians

# Conversion
cog_degrees = (radians × 57.2957795131) % 360

# Validation
0 ≤ radians ≤ 6.5 (approximately 0 to 2π)
```

**Notes:**

- Found in 86-92 byte packets
- At dock/low speed, COG jumps randomly (GPS noise when stationary)
- Field 5.6 also contains an angle that varies similarly (possibly heading-from-GPS)
---

### Field 5 Complete Structure

| Subfield | Wire Type | Purpose | Notes |
|----------|-----------|---------|-------|
| 5 | f64 | Unknown | Often zero |
| **5.1** | f32 | **COG** (radians) | Course Over Ground |
| 5.3 | f32 | Unknown | Constant 0.05 |
| 5.4 | f32 | Unknown | Constant 0.1 |
| **5.5** | f32 | **SOG** (m/s) | Speed Over Ground |
| 5.6 | f32 | Secondary angle | Varies like COG |
| 5.7 | f32 | Unknown | Constant 11.93 |
---

### Compass Heading ⚠️ VARIABLE

**Location:** Field 3.2

```python
Field 3.2 (fixed32/float) → Heading in radians

# Conversion
heading_degrees = radians × 57.2957795131
heading_degrees = heading_degrees % 360

# Validation
0 ≤ radians ≤ 6.5 (approximately 0 to 2π)
```
---

### Wind Data ⚠️ VARIABLE

**Location:** Fields 13.4, 13.5, 13.6

```python
Field 13.4 (fixed32/float) → True Wind Direction (radians)
Field 13.5 (fixed32/float) → True Wind Speed (m/s)
Field 13.6 (fixed32/float) → Apparent Wind Speed (m/s)

# Conversions
direction_deg = radians × 57.2957795131
speed_kts = speed_ms × 1.94384449

# Validation
0 ≤ angle ≤ 6.5 (radians)
0 ≤ speed ≤ 100 (m/s)
```
---

### Depth ⚠️ VARIABLE

**Location:** Field 7.1 (only in larger packets, 1472 bytes and up)

```python
Field 7.1 (fixed32/float) → Depth in meters

# Conversion
depth_feet = depth_meters / 0.3048

# Validation
0 < depth ≤ 1000 (meters)
```
---

### Barometric Pressure ✅ RELIABLE

**Location:** Field 15.1

```python
Field 15.1 (fixed32/float) → Barometric Pressure (Pascals)

# Conversion
pressure_mbar = pressure_pa * 0.01
pressure_inhg = pressure_mbar * 0.02953

# Validation
87000 ≤ value ≤ 108400 Pa (870-1084 mbar)
```
---

### Temperature ✅ RELIABLE

**Location:** Field 15.3 (air), Field 15.9 (water)

```python
Field 15.3 (fixed32/float) → Air Temperature (Kelvin)
Field 15.9 (fixed32/float) → Water Temperature (Kelvin)

# Conversion
temp_celsius = temp_kelvin - 273.15
temp_fahrenheit = temp_celsius × 9/5 + 32

# Validation
Air:   200 ≤ value ≤ 350 K (-73°C to 77°C)
Water: 270 ≤ value ≤ 320 K (-3°C to 47°C)
```
---

### Tank Levels ✅ RELIABLE

**Location:** Field 16 (repeated)

```python
Field 16 (repeated messages):
    Field 1 (varint)        → Tank ID
    Field 2 (varint)        → Status flag
    Field 3 (fixed32/float) → Level percentage

# Validation
0 ≤ level ≤ 100 (percentage)
```

**Tank ID Mapping:**

| ID | Name | Capacity | Notes |
|----|------|----------|-------|
| 1 | Starboard Fuel | 265 gal | Has explicit ID |
| 2 | Port Fuel | 265 gal | Inferred (no ID, no status) |
| 10 | Forward Water | 90 gal | |
| 11 | Aft Water | 90 gal | |
| 100 | Black Water | 53 gal | Inferred (status=5) |

**Inference Logic:**
```python
if tank_id is None:
    if status == 5:
        tank_id = 100  # Black/waste water
    elif status is None:
        tank_id = 2  # Port Fuel (only tank with no ID or status)
```
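As a schema-less sketch of how one Field 16 entry can be decoded, here is a minimal parser for the three subfields above (the sample bytes are synthetic, built from the documented layout):

```python
import struct

def parse_tank_entry(buf: bytes) -> dict:
    """Decode one tank message: 1=ID (varint), 2=status (varint), 3=level (fixed32 float)."""
    out = {}
    i = 0
    while i < len(buf):
        field, wire = buf[i] >> 3, buf[i] & 0x07  # single-byte tag
        i += 1
        if wire == 0:  # varint
            val = 0
            shift = 0
            while True:
                b = buf[i]; i += 1
                val |= (b & 0x7F) << shift
                shift += 7
                if not b & 0x80:
                    break
            out[field] = val
        elif wire == 5:  # fixed32 float
            out[field] = struct.unpack("<f", buf[i:i + 4])[0]
            i += 4
    return out

# Synthetic entry: tank 1, status 0, level 75.2%
entry = b"\x08\x01\x10\x00\x1d" + struct.pack("<f", 75.2)
print(parse_tank_entry(entry))
```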
---

### House Batteries ✅ RELIABLE

**Location:** Field 20 (repeated)

```python
Field 20 (repeated messages):
    Field 1 (varint)        → Battery ID
    Field 3 (fixed32/float) → Voltage (volts)

# Validation
10 ≤ voltage ≤ 60 (covers 12V, 24V, 48V systems)
```

**Battery ID Mapping:**

| ID | Name | Expected Voltage |
|----|------|------------------|
| 11 | Aft House | ~26.3V (24V system) |
| 13 | Stern House | ~27.2V (24V system) |
---

### Engine Batteries ✅ RELIABLE

**Location:** Field 14.3.4 (deeply nested, 3 levels)

```python
Field 14 (repeated messages):
    Field 1 (varint) → Engine ID (0=Port, 1=Starboard)
    Field 3 (length/nested message):
        Field 4 (fixed32/float) → Battery voltage (volts)

# Extraction requires parsing Field 14.3 as a nested protobuf
# to extract Field 4 (voltage)

# Battery ID calculation
battery_id = 1000 + engine_id
# Port Engine = 1000, Starboard Engine = 1001

# Validation
10 ≤ voltage ≤ 60 (volts)
```

**Engine Battery Mapping:**

| Engine ID | Battery ID | Name |
|-----------|------------|------|
| 0 | 1000 | Port Engine |
| 1 | 1001 | Starboard Engine |
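A minimal sketch of the two-level parse described above; the field iterator and the sample bytes are illustrative (assumes lengths under 128 bytes), not the toolkit's API:

```python
import struct

def iter_fields(buf: bytes):
    """Yield (field_number, wire_type, raw_value) from a protobuf buffer."""
    i = 0
    while i < len(buf):
        key = 0
        shift = 0
        while True:  # the tag is itself a varint
            b = buf[i]; i += 1
            key |= (b & 0x7F) << shift
            shift += 7
            if not b & 0x80:
                break
        field, wire = key >> 3, key & 0x07
        if wire == 0:  # varint
            val = 0
            shift = 0
            while True:
                b = buf[i]; i += 1
                val |= (b & 0x7F) << shift
                shift += 7
                if not b & 0x80:
                    break
        elif wire == 1:  # fixed64
            val = buf[i:i + 8]; i += 8
        elif wire == 5:  # fixed32
            val = buf[i:i + 4]; i += 4
        else:  # wire type 2, length-delimited (assumes length < 128)
            n = buf[i]; i += 1
            val = buf[i:i + n]; i += n
        yield field, wire, val

# Synthetic Field 14 message: field 1 = engine ID, field 3 = nested {field 4 = voltage}
inner = b"\x25" + struct.pack("<f", 26.5)                 # 14.3.4, fixed32
outer = b"\x08\x00" + b"\x1a" + bytes([len(inner)]) + inner
for field, wire, val in iter_fields(outer):
    if field == 3 and wire == 2:                          # recurse into 14.3
        for f2, w2, v2 in iter_fields(val):
            if f2 == 4 and w2 == 5:
                print(struct.unpack("<f", v2)[0])         # 26.5
```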
---

## Technical Challenges

### 1. No Schema Available

Protocol Buffers normally use a `.proto` schema file to define message structure. Without Raymarine's proprietary schema, we cannot:

- Know message type identifiers
- Understand field semantics
- Differentiate between message types

### 2. Field Number Collision

The same protobuf field number means different things in different message types:

- Field 4 at one offset might be wind speed
- Field 4 at another offset might be something else entirely

### 3. Variable Packet Structure

Packets of different sizes have completely different internal layouts:

- GPS appears at offset ~0x0032 in large packets
- Sensor data appears at different offsets depending on packet size
- Nested submessages add complexity

### 4. No Message Type Markers

Unlike some protocols, there's no obvious message type identifier in the packet header that would allow us to switch parsing logic based on message type.

### 5. Mixed Precision

Some values use 64-bit doubles, others use 32-bit floats. Both can appear in the same packet, and the same logical value (e.g., an angle) might be encoded differently in different message types.
---

## Recommended Approach for Reliable Decoding

### Option 1: GPS-Anchored Parsing

1. Find GPS using the reliable `0x09`/`0x11` pattern
2. Use the GPS offset as an anchor point
3. Extract values at fixed byte offsets relative to GPS
4. Maintain separate offset tables for each packet size

### Option 2: Packet Size Dispatch

1. Identify the packet by size
2. Apply size-specific parsing rules
3. Use absolute byte offsets (not field numbers)
4. Maintain a mapping table: `(packet_size, offset) → sensor_type`
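Option 2 can be sketched as a lookup table; the sizes and offsets below are placeholders for illustration, not confirmed mappings:

```python
import struct

# (packet_size, offset) → (sensor_name, struct format); entries are placeholders
SENSOR_MAP = {
    (344, 0x32): ("latitude", "<d"),
    (344, 0x3A): ("longitude", "<d"),
}

def decode_by_size(packet: bytes) -> dict:
    """Apply size-specific absolute-offset rules to one packet."""
    out = {}
    for (size, offset), (name, fmt) in SENSOR_MAP.items():
        if len(packet) == size:
            out[name] = struct.unpack_from(fmt, packet, offset)[0]
    return out
```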
### Option 3: Value Correlation

1. Collect all extracted values
2. Compare against known ground truth (displayed values on the MFD)
3. Use statistical correlation to identify correct mappings
4. Build confidence scores for each mapping

---
## Tools Included

### Main Decoders

| Tool | Purpose |
|------|---------|
| `protobuf_decoder.py` | **Primary decoder** - all fields via proper protobuf parsing |
| `raymarine_decoder.py` | High-level decoder with live dashboard display |

### Discovery & Debug Tools

| Tool | Purpose |
|------|---------|
| `battery_debug.py` | Deep nesting parser for Field 14.3.4 (engine batteries) |
| `battery_finder.py` | Scans multicast groups for voltage-like values |
| `tank_debug.py` | Raw Field 16 entry inspection |
| `tank_finder.py` | Searches for tank level percentages |
| `field_debugger.py` | Deep analysis of packet fields |

### Analysis Tools

| Tool | Purpose |
|------|---------|
| `analyze_structure.py` | Packet structure analysis |
| `field_mapping.py` | Documents the protobuf structure |
| `protobuf_parser.py` | Lower-level wire format decoder |
| `watch_field.py` | Monitor specific field values over time |

### Wind/Heading Finders

| Tool | Purpose |
|------|---------|
| `wind_finder.py` | Searches for wind speed values |
| `find_twd.py` | Searches for true wind direction |
| `find_heading_vs_twd.py` | Compares heading and TWD values |
| `find_consistent_heading.py` | Identifies stable heading fields |
## Usage

```bash
# Run the primary protobuf decoder (live network)
python protobuf_decoder.py -i YOUR_VLAN_IP

# JSON output for integration
python protobuf_decoder.py -i YOUR_VLAN_IP --json

# Decode from a pcap file (offline analysis)
python protobuf_decoder.py --pcap raymarine_sample.pcap

# Debug battery extraction
python battery_debug.py --pcap raymarine_sample.pcap

# Debug tank data
python tank_debug.py --pcap raymarine_sample.pcap
```

Replace `YOUR_VLAN_IP` with your interface IP on the Raymarine VLAN (e.g., `198.18.5.5`).

**No external dependencies required** - uses only the Python standard library.
---

## Sample Output

```
============================================================
RAYMARINE DECODER (Protobuf)                       17:36:01
============================================================
GPS:     24.932652, -80.627569
Heading: 35.2°
Wind:    14.6 kts @ 68.5° (true)
Depth:   7.5 ft (2.3 m)
Temp:    Air 24.8°C / 76.6°F, Water 26.2°C / 79.2°F
Tanks:   Stbd Fuel: 75.2% (199gal), Port Fuel: 68.1% (180gal), ...
Batts:   Aft House: 26.3V, Stern House: 27.2V, Port Engine: 26.5V
------------------------------------------------------------
Packets: 4521   Decoded: 4312   Uptime: 85.2s
============================================================
```

### JSON Output

```json
{
  "timestamp": "2025-12-23T17:36:01.123456",
  "position": {"latitude": 24.932652, "longitude": -80.627569},
  "navigation": {"heading_deg": 35.2, "cog_deg": null, "sog_kts": null},
  "wind": {"true_direction_deg": 68.5, "true_speed_kts": 14.6, ...},
  "depth": {"feet": 7.5, "meters": 2.3},
  "temperature": {"water_c": 26.2, "air_c": 24.8},
  "tanks": {
    "1": {"name": "Stbd Fuel", "level_pct": 75.2, "capacity_gal": 265},
    "2": {"name": "Port Fuel", "level_pct": 68.1, "capacity_gal": 265}
  },
  "batteries": {
    "11": {"name": "Aft House", "voltage_v": 26.3},
    "13": {"name": "Stern House", "voltage_v": 27.2},
    "1000": {"name": "Port Engine", "voltage_v": 26.5}
  }
}
```
---

## Future Work

1. ~~**SOG/COG extraction**~~ ✅ **DONE** - Field 5.5 (SOG) and Field 5.1 (COG) identified
2. **Apparent Wind Angle** - AWA field location to be confirmed
3. **Additional engine data** - RPM, fuel flow, oil pressure likely in Field 14
4. **Fields 5.3, 5.4, 5.7** - Unknown constants (0.05, 0.1, 11.93) - purpose TBD
5. **Investigate SignalK** - The MFDs expose HTTP on port 8080, which may provide a cleaner API
6. **NMEA TCP/UDP** - Check if standard NMEA is available on other ports (10110, 2000, etc.)

---

## References

- [Protocol Buffers Encoding](https://developers.google.com/protocol-buffers/docs/encoding)
- [Raymarine LightHouse OS](https://www.raymarine.com/lighthouse/)
- Test location: Florida Keys (24° 55' N, 80° 37' W)

---

## License

This reverse-engineering effort is for personal/educational use. The Raymarine protocol is proprietary.
194  axiom-nmea/README.md  Normal file
@@ -0,0 +1,194 @@
# Raymarine LightHouse Protocol Decoder

A Python toolkit for decoding sensor data from Raymarine AXIOM/LightHouse multicast networks.

> **See [PROTOCOL.md](PROTOCOL.md) for detailed protocol analysis and field mappings.**

## Protocol Discovery Summary

**Key Finding: Raymarine uses Google Protocol Buffers over UDP multicast, NOT standard NMEA 0183.**

### Decoding Status

| Sensor | Status | Field | Notes |
|--------|--------|-------|-------|
| **GPS Position** | Reliable | 2.1, 2.2 | 64-bit doubles, decimal degrees |
| Compass Heading | Variable | 3.2 | 32-bit float, radians |
| Wind Direction | Variable | 13.4 | 32-bit float, radians (true) |
| Wind Speed | Variable | 13.5, 13.6 | 32-bit float, m/s (true/apparent) |
| Depth | Variable | 7.1 | 32-bit float, meters |
| **Water Temperature** | Reliable | 15.9 | 32-bit float, Kelvin |
| Air Temperature | Variable | 15.3 | 32-bit float, Kelvin |
| **Tank Levels** | Reliable | 16 | Repeated field with ID, status, level % |
| **House Batteries** | Reliable | 20 | Repeated field with ID and voltage |
| **Engine Batteries** | Reliable | 14.3.4 | Deeply nested (3 levels) with voltage |

### Why Variable?

Without Raymarine's proprietary protobuf schema, we're reverse-engineering blind:

- The same field numbers mean different things in different packet types
- Packet structure varies by size (344 bytes vs 2056 bytes)
- There are no message type identifiers in the headers

See [PROTOCOL.md](PROTOCOL.md) for the full technical analysis.
### Multicast Groups & Sources

| Group | Port | Source IP | Device | Data |
|-------|------|-----------|--------|------|
| 226.192.206.98 | 2561 | 10.22.6.115 | Unknown | Navigation (mostly zeros) |
| 226.192.206.99 | 2562 | 198.18.1.170 | AXIOM 12 (Data Master) | Heartbeat/status |
| 226.192.206.102 | 2565 | 198.18.1.170 | AXIOM 12 (Data Master) | **Primary sensor data** |
| 226.192.219.0 | 3221 | 198.18.2.191 | AXIOM PLUS 12 RV | Display sync |

**Primary Data Source:** `198.18.1.170` broadcasts GPS, wind, depth, heading, temperatures, tank levels, and battery voltages.
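Joining one of these groups takes only the standard library. A sketch (the group and IP values come from the table above; adjust for your network):

```python
import socket

def open_multicast(group: str, port: int, local_ip: str) -> socket.socket:
    """Join a UDP multicast group on the interface with the given local IP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # ip_mreq: 4-byte group address + 4-byte interface address
    mreq = socket.inet_aton(group) + socket.inet_aton(local_ip)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Usage on the boat network:
#   sock = open_multicast("226.192.206.102", 2565, "198.18.5.5")
#   data, addr = sock.recvfrom(65535)
```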
### Data Encoding

- Wire format: Protobuf (fixed64 doubles + fixed32 floats)
- Angles: **Radians** (multiply by 57.2958 for degrees)
- Wind speed: **m/s** (multiply by 1.94384 for knots)
- Depth: **Meters** (divide by 0.3048 for feet)
- Temperature: **Kelvin** (subtract 273.15 for Celsius)
- Tank levels: **Percentage** (0-100%)
- Battery voltage: **Volts** (direct value)
## Quick Start

**No installation required** - clone and run. Uses only the Python standard library.

```bash
git clone https://github.com/terbonium/axiom-nmea.git
cd axiom-nmea
```

Optional: install as a package for use in your own projects:

```bash
pip install -e .
```

## Usage

```bash
# Run the decoder with a live dashboard (requires network access to the Raymarine VLAN)
python debug/protobuf_decoder.py -i YOUR_VLAN_IP

# JSON output (for integration with other systems)
python debug/protobuf_decoder.py -i YOUR_VLAN_IP --json

# Decode from a pcap file (offline analysis)
python debug/protobuf_decoder.py --pcap samples/raymarine_sample.pcap
```

Replace `YOUR_VLAN_IP` with your VLAN interface IP (e.g., `198.18.5.5`).
## Sample Output

```
============================================================
RAYMARINE DECODER (Protobuf)                       16:13:21
============================================================
GPS:     24.932652, -80.627569
Heading: 35.2°
Wind:    14.6 kts @ 68.5° (true)
Depth:   7.5 ft (2.3 m)
Temp:    Air 24.8°C / 76.6°F, Water 26.2°C / 79.2°F
Tanks:   Stbd Fuel: 75.2% (199gal), Port Fuel: 68.1% (180gal), ...
Batts:   Aft House: 26.3V, Stern House: 27.2V, Port Engine: 26.5V
------------------------------------------------------------
Packets: 9444   Decoded: 8921   Uptime: 124.5s
============================================================
```

## Directory Structure

```
axiom-nmea/
├── debug/                    # Debug and analysis tools (see debug/README.md)
├── examples/                 # Example applications
│   ├── quickstart/           # Minimal library usage example
│   ├── nmea-server-example/  # TCP NMEA sentence server
│   ├── pcap-to-nmea/         # Convert pcap to NMEA sentences
│   ├── sensor-monitor/       # Real-time sensor update monitor
│   ├── victron-bridge/       # Victron Venus OS integration
│   └── windy-station/        # Windy.com weather station
├── nmea-server/              # Dockerized NMEA server
├── raymarine_nmea/           # Core library
├── samples/                  # Sample pcap files (not committed)
└── PROTOCOL.md               # Protocol documentation
```
## Tools Included

All debug tools are in the `debug/` directory. See [debug/README.md](debug/README.md) for full documentation.

### Main Decoders

| Tool | Description |
|------|-------------|
| `debug/protobuf_decoder.py` | **Primary decoder** - all fields via proper protobuf parsing |
| `debug/raymarine_decoder.py` | Alternative decoder with live dashboard display |

### Discovery Tools

| Tool | Description |
|------|-------------|
| `debug/battery_debug.py` | Deep nesting parser for battery fields (Field 14.3.4) |
| `debug/battery_finder.py` | Scans multicast groups for voltage-like values |
| `debug/tank_debug.py` | Raw Field 16 entry inspection |
| `debug/tank_finder.py` | Searches for tank level percentages |
| `debug/field_debugger.py` | Deep analysis of packet fields |

### Analysis Tools

| Tool | Description |
|------|-------------|
| `debug/analyze_structure.py` | Packet structure analysis |
| `debug/field_mapping.py` | Documents the protobuf structure |
| `debug/protobuf_parser.py` | Lower-level wire format decoder |
| `debug/watch_field.py` | Monitor specific field values over time |

### Wind/Heading Finders

| Tool | Description |
|------|-------------|
| `debug/wind_finder.py` | Searches for wind speed values |
| `debug/find_twd.py` | Searches for true wind direction |
| `debug/find_heading_vs_twd.py` | Compares heading and TWD values |
| `debug/find_consistent_heading.py` | Identifies stable heading fields |
## Network Configuration

### VLAN Setup

Your network needs access to the Raymarine VLAN to receive multicast traffic:

```bash
# Check that the VLAN interface exists
ip link show vlan.200

# If not, create it (requires VLAN support)
sudo ip link add link eth0 name vlan.200 type vlan id 200
sudo ip link set dev vlan.200 up
sudo dhclient vlan.200
```

### Testing Multicast Reception

Before running the decoder, verify you can receive the multicast traffic:

```bash
# Check if multicast traffic is arriving
tcpdump -i vlan.200 -c 10 'udp and dst net 224.0.0.0/4'

# Look for specific ports
tcpdump -i vlan.200 -c 20 'udp port 2561 or udp port 2562 or udp port 2565'
```

## Sample PCAP Files

Place your packet captures in the `samples/` directory for offline analysis:

- `samples/raymarine_sample.pcap` - General sample data
- `samples/raymarine_sample_TWD_62-70_HDG_29-35.pcap` - Known heading/wind angles
- `samples/raymarine_sample_twd_69-73.pcap` - Additional wind samples

See [samples/README.md](samples/README.md) for capture instructions. Note: `.pcap` files are not committed to git.

## License

MIT License - Use freely for debugging your marine electronics.
155  axiom-nmea/debug/README.md  Normal file
@@ -0,0 +1,155 @@
# Debug Scripts

This directory contains debugging and analysis tools for reverse-engineering the Raymarine LightHouse network protocol. These scripts are used to discover field mappings, locate sensor data, and understand the protobuf structure.

## Protocol Analysis

### `protobuf_decoder.py`
Full protobuf decoder with documented field mappings. Parses the nested protobuf structure and extracts sensor data (GPS, wind, depth, tanks, batteries, temperature).

```bash
python protobuf_decoder.py -i 198.18.5.5
python protobuf_decoder.py --pcap capture.pcap
```

### `protobuf_parser.py`
Low-level protobuf wire format parser without a schema. Decodes the nested message structure to understand the protocol.

### `raymarine_decoder.py`
Standalone Raymarine decoder that extracts sensor data from multicast packets. Outputs human-readable or JSON format.

```bash
python raymarine_decoder.py -i 198.18.5.5
python raymarine_decoder.py -i 198.18.5.5 --json
python raymarine_decoder.py --pcap raymarine_sample.pcap
```

### `packet_debug.py`
Dumps the raw protobuf field structure from multicast packets. Shows all top-level fields, nested structures, and decoded values.

```bash
python packet_debug.py -i 198.18.5.5
```

### `field_debugger.py`
Interactive field mapper that displays all protobuf fields in a columnar format. Useful for correlating field values with real-world sensor readings.

```bash
python field_debugger.py -i 192.168.1.100           # Live capture
python field_debugger.py --pcap capture.pcap        # From file
python field_debugger.py --pcap capture.pcap -n 5   # Show 5 snapshots
```

### `field_mapping.py`
Documents the discovered field structure and validates it against captured data. Shows the relationship between protobuf fields and sensor values.

### `analyze_structure.py`
Analyzes packet header structure and protobuf nesting patterns. Groups packets by size and examines common headers.
## Sensor Finders

These scripts search for specific sensor values within the protobuf stream.

### `find_cog_sog.py`
Searches all protobuf fields for values matching expected COG (Course Over Ground) and SOG (Speed Over Ground) ranges.

```bash
python find_cog_sog.py -i 198.18.5.5 --cog-min 0 --cog-max 359 --sog-min 0 --sog-max 0.5
python find_cog_sog.py -i 198.18.5.5 --show-all   # Show ALL numeric fields
python find_cog_sog.py -i 198.18.5.5 -f 2         # Filter to field 2 only
```

### `wind_finder.py`
Searches for wind speed and direction values in captured packets. Expects wind speed in m/s (7-12) and direction in radians (1.0-1.7).

### `find_twd.py`
Searches for True Wind Direction values in a specific degree range (e.g., 69-73 degrees).

### `find_twd_precise.py`
Precise TWD finder with a tighter tolerance for exact value matching.

### `find_twd_hdg.py`
Finds both TWD and heading offsets using known reference values.

### `find_heading_vs_twd.py`
Correlates heading and TWD candidates. At anchor pointing into the wind, heading and TWD should be within ~50 degrees.

### `find_consistent_heading.py`
Finds offsets that show consistent heading-like values across multiple pcap files.

### `pressure_finder.py`
Locates barometric pressure data by searching for values matching a known pressure reading.

```bash
python pressure_finder.py -i YOUR_INTERFACE_IP -p 1021   # Known pressure in mbar
```

### `battery_finder.py`
Scans all multicast groups for values matching expected battery voltages (12V, 24V systems).

```bash
python battery_finder.py -i 198.18.5.5 -t 10
python battery_finder.py -i 198.18.5.5 -v   # Verbose mode
```

### `tank_finder.py`
Scans multicast groups for values matching expected tank level percentages.
## Field-Specific Debug Tools

### `battery_debug.py`
Dumps raw protobuf entries to find battery data fields. Supports deep nesting analysis (e.g., Field 14.3.4 for the engine battery).

```bash
python battery_debug.py -i 198.18.5.5 -t 5
python battery_debug.py -i 198.18.5.5 -f 14   # Focus on Field 14
```

### `tank_debug.py`
Dumps raw Field 16 entries to discover tank IDs and status values.

### `debug_wind.py`
Examines actual packet bytes at known wind data offsets.

### `debug_decode.py`
Simulates the wind extraction logic and shows the packet size distribution.

### `debug_field13.py`
Debugs Field 13 (Wind/Navigation) extraction across different packet sizes.

### `watch_field.py`
Monitors a specific protobuf field path across incoming packets.

```bash
python watch_field.py -i 192.168.1.100 --field 7.1       # Watch depth
python watch_field.py --pcap capture.pcap --field 13.4   # Watch TWD
```

## Offset Comparison Tools

### `compare_offsets.py`
Compares old fixed-byte offsets vs protobuf field-based offsets for wind direction.

### `compare_heading_both_pcaps.py`
Compares heading and TWD candidates between two different pcap files to validate consistency.

### `check_006b.py`
Thoroughly checks offset 0x006b across all packets of various sizes.

## Usage Notes

Most scripts require either:

- `-i, --interface` - The IP address of the interface connected to the Raymarine network
- `--pcap` - Path to a captured pcap file for offline analysis

Common multicast groups:

- `226.192.206.102:2565` - Primary sensor data
- `226.192.206.98:2561` - Navigation sensors
- `239.2.1.1:2154` - Additional sensor data (tanks, engines)

## Data Formats

- Angles are stored in **radians** (multiply by 57.2958 for degrees)
- Speeds are stored in **m/s** (multiply by 1.94384 for knots)
- Depth is stored in **meters**
- Temperature is stored in **Kelvin** (subtract 273.15 for Celsius)
94  axiom-nmea/debug/analyze_structure.py  Normal file
@@ -0,0 +1,94 @@
#!/usr/bin/env python3
|
||||
"""
|
||||
Analyze packet structure to understand why offsets vary.
|
||||
Look for common headers and protobuf nesting patterns.
|
||||
"""
|
||||
|
||||
import struct
|
||||
|
||||
def read_pcap(filename):
|
||||
packets = []
|
||||
with open(filename, 'rb') as f:
|
||||
header = f.read(24)
|
||||
magic = struct.unpack('<I', header[0:4])[0]
|
||||
swapped = magic == 0xd4c3b2a1
|
||||
endian = '>' if swapped else '<'
|
||||
|
||||
while True:
|
||||
pkt_header = f.read(16)
|
||||
if len(pkt_header) < 16:
|
||||
break
|
||||
ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
|
||||
pkt_data = f.read(incl_len)
|
||||
if len(pkt_data) < incl_len:
|
||||
break
|
||||
|
||||
if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
|
||||
ip_header_len = (pkt_data[14] & 0x0F) * 4
|
||||
payload_start = 14 + ip_header_len + 8
|
||||
if payload_start < len(pkt_data):
|
||||
packets.append(pkt_data[payload_start:])
|
||||
return packets
|
||||
|
||||
packets = read_pcap("raymarine_sample_TWD_62-70_HDG_29-35.pcap")
|
||||
|
||||
# Group by size
|
||||
by_size = {}
|
||||
for pkt in packets:
|
||||
pkt_len = len(pkt)
|
||||
if pkt_len not in by_size:
|
||||
by_size[pkt_len] = []
|
||||
by_size[pkt_len].append(pkt)
|
||||
|
||||
print("=" * 70)
|
||||
print("PACKET HEADER ANALYSIS")
|
||||
print("=" * 70)
|
||||
|
||||
for pkt_len in sorted(by_size.keys()):
|
||||
if pkt_len < 100:
|
||||
continue
|
||||
|
||||
pkts = by_size[pkt_len][:3] # First 3 packets of each size
|
||||
|
||||
print(f"\n{'='*70}")
|
||||
print(f"PACKET SIZE: {pkt_len} bytes ({len(by_size[pkt_len])} total)")
|
||||
print("=" * 70)
|
||||
|
||||
pkt = pkts[0]
|
||||
|
||||
# Show first 128 bytes as hex
|
||||
print("\nFirst 128 bytes (hex):")
|
||||
for row in range(0, min(128, pkt_len), 16):
|
||||
hex_str = ' '.join(f'{b:02x}' for b in pkt[row:row+16])
|
||||
ascii_str = ''.join(chr(b) if 32 <= b < 127 else '.' for b in pkt[row:row+16])
|
||||
print(f" 0x{row:04x}: {hex_str:<48} {ascii_str}")
|
||||
|
||||
# Look for the GPS pattern (0x09 followed by lat, 0x11 followed by lon)
|
||||
gps_offset = None
|
||||
for i in range(0x20, min(pkt_len - 18, 0x60)):
|
||||
if pkt[i] == 0x09 and i + 9 < pkt_len and pkt[i + 9] == 0x11:
|
||||
lat = struct.unpack('<d', pkt[i+1:i+9])[0]
|
||||
lon = struct.unpack('<d', pkt[i+10:i+18])[0]
|
||||
if -90 <= lat <= 90 and -180 <= lon <= 180 and abs(lat) > 1:
|
||||
gps_offset = i
|
||||
print(f"\n GPS found at offset 0x{i:04x}: {lat:.6f}, {lon:.6f}")
|
||||
break
|
||||
|
||||
# Look for string patterns (device name)
|
||||
for i in range(0x10, min(pkt_len - 10, 0x40)):
|
||||
if pkt[i:i+5] == b'AXIOM':
|
||||
print(f" 'AXIOM' string at offset 0x{i:04x}")
|
||||
# Show surrounding context
|
||||
end = min(i + 30, pkt_len)
|
||||
print(f" Context: {pkt[i:end]}")
|
||||
break
|
||||
|
||||
# Analyze protobuf wire types at key offsets
|
||||
print(f"\n Protobuf tags at key offsets:")
|
||||
for offset in [0x0070, 0x00a0, 0x00a7, 0x00c5, 0x00fc]:
|
||||
if offset < pkt_len:
|
||||
tag = pkt[offset]
|
||||
wire_type = tag & 0x07
|
||||
field_num = tag >> 3
|
||||
wire_names = {0: 'varint', 1: 'fixed64', 2: 'length', 5: 'fixed32'}
|
||||
print(f" 0x{offset:04x}: tag=0x{tag:02x} (field {field_num}, {wire_names.get(wire_type, '?')})")
|
||||
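The tag-byte arithmetic the dump above relies on (`field = tag >> 3`, `wire type = tag & 0x07`) can be sanity-checked in isolation. A minimal sketch, independent of the capture data, showing why `0x09`/`0x11` mark the lat/lon doubles:

```python
WIRE_NAMES = {0: 'varint', 1: 'fixed64', 2: 'length', 5: 'fixed32'}

def decode_tag(tag: int):
    """Split a single-byte protobuf tag into (field number, wire type name)."""
    return tag >> 3, WIRE_NAMES.get(tag & 0x07, '?')

# Field 1, wire type 1 (fixed64) -> tag byte 0x09: the latitude marker
print(decode_tag(0x09))   # (1, 'fixed64')
# Field 2, wire type 1 (fixed64) -> tag byte 0x11: the longitude marker
print(decode_tag(0x11))   # (2, 'fixed64')
```

This is why the GPS scan above looks for `0x09` followed nine bytes later by `0x11`: two consecutive fixed64 fields, each one tag byte plus an 8-byte double.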
453
axiom-nmea/debug/battery_debug.py
Normal file
@@ -0,0 +1,453 @@
#!/usr/bin/env python3
"""
Battery Debug - Dump raw protobuf entries to find battery data fields.

Scans for fields that might contain battery data (voltage, current, SoC).
Supports deep nesting (e.g., Field 14.3.4 for engine battery).
"""

import struct
import socket
import time
import threading
from typing import Dict, List, Any, Optional

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

# Fields to specifically look for battery data
# Field 14 = Engine data (contains battery at 14.3.4)
# Field 20 = House batteries
BATTERY_CANDIDATE_FIELDS = {14, 17, 18, 19, 20, 21, 22, 23, 24, 25}

MULTICAST_GROUPS = [
    ("226.192.206.102", 2565),  # Main sensor data
    ("239.2.1.1", 2154),        # May contain additional sensor data
]

class ProtobufParser:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def remaining(self):
        return len(self.data) - self.pos

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def read_fixed32(self) -> bytes:
        val = self.data[self.pos:self.pos + 4]
        self.pos += 4
        return val

    def read_fixed64(self) -> bytes:
        val = self.data[self.pos:self.pos + 8]
        self.pos += 8
        return val

    def read_length_delimited(self) -> bytes:
        length = self.read_varint()
        val = self.data[self.pos:self.pos + length]
        self.pos += length
        return val

    def parse_all_fields(self) -> Dict[int, List[Any]]:
        """Parse and collect all fields, grouping repeated fields."""
        fields = {}

        while self.pos < len(self.data):
            if self.remaining() < 1:
                break
            try:
                start_pos = self.pos
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 1000:
                    break

                if wire_type == WIRE_VARINT:
                    value = ('varint', self.read_varint())
                elif wire_type == WIRE_FIXED64:
                    raw = self.read_fixed64()
                    try:
                        d = struct.unpack('<d', raw)[0]
                        value = ('double', d, raw.hex())
                    except:
                        value = ('fixed64', raw.hex())
                elif wire_type == WIRE_LENGTH:
                    raw = self.read_length_delimited()
                    value = ('length', raw)
                elif wire_type == WIRE_FIXED32:
                    raw = self.read_fixed32()
                    try:
                        f = struct.unpack('<f', raw)[0]
                        value = ('float', f, raw.hex())
                    except:
                        value = ('fixed32', raw.hex())
                else:
                    break

                if field_num not in fields:
                    fields[field_num] = []
                fields[field_num].append(value)

            except:
                break

        return fields

    def parse_nested_deep(self, data: bytes, path: str = "", depth: int = 0, max_depth: int = 5) -> List[tuple]:
        """Recursively parse nested protobuf and return list of (path, type, value) tuples."""
        results = []
        pos = 0

        if depth > max_depth:
            return results

        while pos < len(data):
            try:
                # Read tag
                tag_byte = data[pos]
                pos += 1

                # Handle multi-byte varints for tag
                tag = tag_byte & 0x7F
                shift = 7
                while tag_byte & 0x80 and pos < len(data):
                    tag_byte = data[pos]
                    pos += 1
                    tag |= (tag_byte & 0x7F) << shift
                    shift += 7

                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 100:
                    break

                field_path = f"{path}.{field_num}" if path else str(field_num)

                if wire_type == WIRE_VARINT:
                    # Read varint value
                    val = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        val |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    results.append((field_path, 'varint', val))

                elif wire_type == WIRE_FIXED32:
                    raw = data[pos:pos + 4]
                    pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        if f == f:  # not NaN
                            results.append((field_path, 'float', f))
                    except:
                        pass

                elif wire_type == WIRE_FIXED64:
                    raw = data[pos:pos + 8]
                    pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        if d == d:  # not NaN
                            results.append((field_path, 'double', d))
                    except:
                        pass

                elif wire_type == WIRE_LENGTH:
                    # Read length
                    length = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        length |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    raw = data[pos:pos + length]
                    pos += length

                    # Try to parse as nested message
                    if len(raw) >= 2:
                        nested_results = self.parse_nested_deep(raw, field_path, depth + 1, max_depth)
                        if nested_results:
                            results.extend(nested_results)
                        else:
                            # Couldn't parse as nested, store as bytes
                            results.append((field_path, 'bytes', raw.hex()[:40]))
                    else:
                        results.append((field_path, 'bytes', raw.hex()))

                else:
                    break

            except Exception as e:
                break

        return results

    def parse_nested_entry(self, data: bytes) -> dict:
        """Parse a nested protobuf entry and return all fields (single level)."""
        entry = {'fields': {}}
        pos = 0

        while pos < len(data):
            try:
                # Read tag
                tag_byte = data[pos]
                pos += 1

                # Handle multi-byte varints for tag
                tag = tag_byte & 0x7F
                shift = 7
                while tag_byte & 0x80 and pos < len(data):
                    tag_byte = data[pos]
                    pos += 1
                    tag |= (tag_byte & 0x7F) << shift
                    shift += 7

                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 100:
                    break

                if wire_type == WIRE_VARINT:
                    # Read varint value
                    val = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        val |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    entry['fields'][field_num] = ('varint', val)

                elif wire_type == WIRE_FIXED32:
                    raw = data[pos:pos + 4]
                    pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        entry['fields'][field_num] = ('float', f, raw.hex())
                    except:
                        entry['fields'][field_num] = ('fixed32', raw.hex())

                elif wire_type == WIRE_FIXED64:
                    raw = data[pos:pos + 8]
                    pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        entry['fields'][field_num] = ('double', d, raw.hex())
                    except:
                        entry['fields'][field_num] = ('fixed64', raw.hex())

                elif wire_type == WIRE_LENGTH:
                    # Read length
                    length = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        length |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    raw = data[pos:pos + length]
                    pos += length
                    entry['fields'][field_num] = ('bytes', len(raw), raw.hex()[:40], raw)

                else:
                    break

            except Exception as e:
                entry['parse_error'] = str(e)
                break

        return entry


def is_voltage_like(val: float) -> Optional[str]:
    """Check if a float value looks like a battery voltage."""
    if 10.0 <= val <= 16.0:
        return "12V system"
    if 20.0 <= val <= 32.0:
        return "24V system"
    if 40.0 <= val <= 60.0:
        return "48V system"
    return None


def print_nested_field(raw_data: bytes, field_num: int, indent: str = " "):
    """Print a nested field with deep parsing."""
    parser = ProtobufParser(raw_data)
    deep_results = parser.parse_nested_deep(raw_data, str(field_num))

    # Group results by depth for better display
    print(f"{indent}Deep parse of Field {field_num} (length={len(raw_data)}):")

    for path, vtype, value in deep_results:
        depth = path.count('.')
        sub_indent = indent + " " * depth

        if vtype == 'float':
            voltage_hint = is_voltage_like(value)
            if voltage_hint:
                print(f"{sub_indent}Field {path}: {value:.2f} ({vtype}) <- VOLTAGE? ({voltage_hint})")
            elif value != value:  # NaN
                print(f"{sub_indent}Field {path}: nan ({vtype})")
            else:
                print(f"{sub_indent}Field {path}: {value:.4f} ({vtype})")
        elif vtype == 'double':
            voltage_hint = is_voltage_like(value)
            if voltage_hint:
                print(f"{sub_indent}Field {path}: {value:.2f} ({vtype}) <- VOLTAGE? ({voltage_hint})")
            else:
                print(f"{sub_indent}Field {path}: {value:.4f} ({vtype})")
        elif vtype == 'varint':
            print(f"{sub_indent}Field {path}: {value} ({vtype})")
        else:
            print(f"{sub_indent}Field {path}: {value} ({vtype})")


def scan_packet(data: bytes, group: str, port: int, target_field: Optional[int] = None):
    """Scan a packet and dump potential battery-related fields."""
    if len(data) < HEADER_SIZE + 5:
        return

    proto_data = data[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    all_fields = parser.parse_all_fields()

    print(f"\n{'='*70}")
    print(f"Packet from {group}:{port} (size: {len(data)} bytes)")
    print(f"Top-level fields present: {sorted(all_fields.keys())}")
    print(f"{'='*70}")

    # If a specific field is targeted, only show that one
    fields_to_check = {target_field} if target_field else BATTERY_CANDIDATE_FIELDS

    # Check candidate fields for battery data
    for field_num in sorted(fields_to_check):
        if field_num in all_fields:
            entries = all_fields[field_num]
            print(f"\n Field {field_num}: {len(entries)} entries")

            for i, entry in enumerate(entries):
                if entry[0] == 'length':
                    raw_data = entry[1]
                    print(f"\n Entry {i+1}:")
                    print_nested_field(raw_data, field_num, " ")

                elif entry[0] in ('float', 'double'):
                    voltage_hint = is_voltage_like(entry[1])
                    if voltage_hint:
                        print(f" Value: {entry[1]:.2f} <- VOLTAGE? ({voltage_hint})")
                    else:
                        print(f" Value: {entry[1]:.4f}")
                else:
                    print(f" Value: {entry}")

    # Deep scan ALL fields for voltage-like values
    print(f"\n Deep scanning all fields for voltage-like values:")
    found_any = False
    for field_num, entries in sorted(all_fields.items()):
        for entry in entries:
            if entry[0] == 'length':
                raw_data = entry[1]
                deep_parser = ProtobufParser(raw_data)
                deep_results = deep_parser.parse_nested_deep(raw_data, str(field_num))
                for path, vtype, value in deep_results:
                    if vtype in ('float', 'double') and is_voltage_like(value):
                        print(f" Field {path}: {value:.2f}V ({is_voltage_like(value)})")
                        found_any = True
            elif entry[0] == 'float' and is_voltage_like(entry[1]):
                print(f" Field {field_num}: {entry[1]:.2f}V ({is_voltage_like(entry[1])})")
                found_any = True
            elif entry[0] == 'double' and is_voltage_like(entry[1]):
                print(f" Field {field_num}: {entry[1]:.2f}V ({is_voltage_like(entry[1])})")
                found_any = True

    if not found_any:
        print(f" (no voltage-like values found)")


def main():
    import argparse
    parser = argparse.ArgumentParser(description="Debug potential battery data fields")
    parser.add_argument('-i', '--interface', required=True, help='Interface IP')
    parser.add_argument('-t', '--time', type=int, default=5, help='Capture time (seconds)')
    parser.add_argument('-g', '--group', default="226.192.206.102", help='Multicast group')
    parser.add_argument('-p', '--port', type=int, default=2565, help='Port')
    parser.add_argument('-f', '--field', type=int, help='Focus on specific field number (e.g., 14)')
    args = parser.parse_args()

    print(f"Scanning for battery data fields...")
    if args.field:
        print(f"Focusing on Field {args.field} with deep nesting")
    else:
        print(f"Looking at fields: {sorted(BATTERY_CANDIDATE_FIELDS)}")
    print(f"Target voltages: Aft House ~26.3V, Stern House ~27.2V, Port Engine ~26.5V")
    print(f"\nCapturing from {args.group}:{args.port} for {args.time} seconds...")

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    if hasattr(socket, 'SO_REUSEPORT'):
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    sock.bind(('', args.port))
    mreq = struct.pack("4s4s", socket.inet_aton(args.group), socket.inet_aton(args.interface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(1.0)

    seen_sizes = set()
    end_time = time.time() + args.time

    try:
        while time.time() < end_time:
            try:
                data, _ = sock.recvfrom(65535)
                # Only process each unique packet size once
                if len(data) not in seen_sizes:
                    seen_sizes.add(len(data))
                    scan_packet(data, args.group, args.port, args.field)
            except socket.timeout:
                continue
    except KeyboardInterrupt:
        pass
    finally:
        sock.close()

    print("\n\nDone.")


if __name__ == "__main__":
    main()
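Every script in this directory re-implements the same 7-bit little-endian varint reader. As a standalone sanity check of that decoding loop, a round-trip sketch (the encoder here is for illustration only; the debug scripts never need to encode):

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative int as a protobuf varint (7 bits per byte, LSB first)."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def decode_varint(data: bytes, pos: int = 0):
    """Decode a varint starting at pos; returns (value, new_pos). Same loop as the scripts."""
    result = shift = 0
    while pos < len(data):
        byte = data[pos]
        pos += 1
        result |= (byte & 0x7F) << shift
        if not (byte & 0x80):
            break
        shift += 7
    return result, pos

raw = encode_varint(300)      # 300 -> b'\xac\x02'
print(raw.hex())              # ac02
print(decode_varint(raw))     # (300, 2)
```

The two-byte result for 300 also explains the tag handling above: a tag itself is a varint, so field numbers above 15 need the same multi-byte loop the scripts use when reading tags.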
235
axiom-nmea/debug/battery_finder.py
Normal file
@@ -0,0 +1,235 @@
#!/usr/bin/env python3
"""
Battery Finder - Scan all multicast groups for values matching expected battery voltages.

Searches for house battery voltages (26.3V and 27.2V) across all protobuf fields.
"""

import struct
import socket
import time
import threading
from typing import Dict, Any, Optional, List, Tuple

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.98", 2561),
    ("226.192.206.99", 2562),
    ("226.192.206.100", 2563),
    ("226.192.206.101", 2564),
    ("226.192.206.102", 2565),
    ("226.192.219.0", 3221),
    ("239.2.1.1", 2154),
]

# Target voltage values to find (with tolerance)
# Aft House: 26.3V, Stern House: 27.2V
TARGET_VOLTAGES = [
    (26.0, 26.6),  # Aft house battery ~26.3V
    (26.9, 27.5),  # Stern house battery ~27.2V
    (12.0, 15.0),  # 12V battery range (in case they're 12V systems)
    (24.0, 29.0),  # General 24V battery range
]

class ProtobufParser:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def parse(self, path: str = "") -> List[Tuple[str, str, Any]]:
        """Parse and return list of (path, type, value) for all fields."""
        results = []
        while self.pos < len(self.data):
            try:
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 1000:
                    break

                field_path = f"{path}.{field_num}" if path else str(field_num)

                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                    results.append((field_path, "varint", value))
                elif wire_type == WIRE_FIXED64:
                    raw = self.data[self.pos:self.pos + 8]
                    self.pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        if d == d:  # not NaN
                            results.append((field_path, "double", d))
                    except:
                        pass
                elif wire_type == WIRE_LENGTH:
                    length = self.read_varint()
                    raw = self.data[self.pos:self.pos + length]
                    self.pos += length
                    # Try to parse as nested
                    try:
                        nested = ProtobufParser(raw)
                        nested_results = nested.parse(field_path)
                        if nested_results:
                            results.extend(nested_results)
                    except:
                        pass
                elif wire_type == WIRE_FIXED32:
                    raw = self.data[self.pos:self.pos + 4]
                    self.pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        if f == f:  # not NaN
                            results.append((field_path, "float", f))
                    except:
                        pass
                else:
                    break
            except:
                break
        return results


def is_voltage_value(val: float) -> Optional[str]:
    """Check if value matches expected battery voltage ranges."""
    if 26.0 <= val <= 26.6:
        return "Aft House (~26.3V)"
    if 26.9 <= val <= 27.5:
        return "Stern House (~27.2V)"
    if 12.0 <= val <= 15.0:
        return "12V battery range"
    if 24.0 <= val <= 29.0:
        return "24V battery range"
    return None


def scan_packet(data: bytes, group: str, port: int, verbose: bool = False):
    """Scan a packet for voltage values."""
    if len(data) < HEADER_SIZE + 5:
        return

    proto_data = data[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    fields = parser.parse()

    matches = []
    for path, vtype, value in fields:
        if isinstance(value, (int, float)):
            match_desc = is_voltage_value(value)
            if match_desc:
                matches.append((path, vtype, value, match_desc))

    if matches:
        print(f"\n{'='*70}")
        print(f"VOLTAGE MATCH on {group}:{port} (packet size: {len(data)})")
        print(f"{'='*70}")
        for path, vtype, value, desc in matches:
            print(f" Field {path} ({vtype}): {value:.2f}V <- {desc}")

        # Show context: look at parent field structure
        if verbose:
            print(f"\nAll numeric values in packet:")
            for path, vtype, value in fields:
                if isinstance(value, (int, float)) and value != 0:
                    print(f" {path}: {value} ({vtype})")


class MulticastScanner:
    def __init__(self, interface_ip: str, verbose: bool = False):
        self.interface_ip = interface_ip
        self.verbose = verbose
        self.running = False
        self.lock = threading.Lock()

    def _create_socket(self, group: str, port: int):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        if hasattr(socket, 'SO_REUSEPORT'):
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
        sock.bind(('', port))
        mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(self.interface_ip))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(1.0)
        return sock

    def _listen(self, sock, group: str, port: int):
        seen_sizes = set()
        while self.running:
            try:
                data, _ = sock.recvfrom(65535)
                # Only process each unique packet size once per group
                size_key = len(data)
                if size_key not in seen_sizes:
                    seen_sizes.add(size_key)
                    with self.lock:
                        scan_packet(data, group, port, self.verbose)
            except socket.timeout:
                continue
            except:
                pass

    def start(self):
        self.running = True
        threads = []
        for group, port in MULTICAST_GROUPS:
            try:
                sock = self._create_socket(group, port)
                t = threading.Thread(target=self._listen, args=(sock, group, port), daemon=True)
                t.start()
                threads.append(t)
                print(f"Scanning {group}:{port}")
            except Exception as e:
                print(f"Error on {group}:{port}: {e}")
        return threads

    def stop(self):
        self.running = False


def main():
    import argparse
    parser = argparse.ArgumentParser(description="Find battery voltage values in multicast data")
    parser.add_argument('-i', '--interface', required=True, help='Interface IP')
    parser.add_argument('-t', '--time', type=int, default=10, help='Scan duration (seconds)')
    parser.add_argument('-v', '--verbose', action='store_true', help='Show all numeric values')
    args = parser.parse_args()

    print(f"Scanning for battery voltages:")
    print(f" - Aft House: ~26.3V (range 26.0-26.6)")
    print(f" - Stern House: ~27.2V (range 26.9-27.5)")
    print(f" - Also checking 12V and 24V ranges")
    print(f"\nWill scan for {args.time} seconds\n")

    scanner = MulticastScanner(args.interface, args.verbose)
    scanner.start()

    try:
        time.sleep(args.time)
    except KeyboardInterrupt:
        pass
    finally:
        scanner.stop()
        print("\nDone scanning")


if __name__ == "__main__":
    main()
79
axiom-nmea/debug/check_006b.py
Normal file
@@ -0,0 +1,79 @@
#!/usr/bin/env python3
"""Check offset 0x006b thoroughly across all packets."""

import struct
from collections import Counter


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

OFFSET = 0x006b

for pcap in ["raymarine_sample_twd_69-73.pcap", "raymarine_sample.pcap"]:
    print(f"\n{'='*60}")
    print(f"FILE: {pcap}")
    print(f"OFFSET: 0x{OFFSET:04x}")
    print("="*60)

    packets = read_pcap(pcap)

    # Group by packet size
    by_size = {}
    for pkt in packets:
        pkt_len = len(pkt)
        if pkt_len not in by_size:
            by_size[pkt_len] = []
        by_size[pkt_len].append(pkt)

    for pkt_len in sorted(by_size.keys()):
        if pkt_len < 120:  # Skip small packets
            continue

        pkts = by_size[pkt_len]
        vals = []
        for pkt in pkts:
            v = decode_float(pkt, OFFSET)
            if v is not None:
                vals.append(v)

        if vals:
            # Convert to degrees
            degs = [v * 57.2958 for v in vals if 0 <= v <= 6.5]
            if degs:
                avg = sum(degs) / len(degs)
                print(f" {pkt_len:5d} bytes ({len(pkts):3d} pkts): avg={avg:6.1f}°, range={min(degs):.1f}°-{max(degs):.1f}°")
            else:
                # Not radians - show raw
                print(f" {pkt_len:5d} bytes ({len(pkts):3d} pkts): raw values (not radians)")
123
axiom-nmea/debug/compare_heading_both_pcaps.py
Normal file
@@ -0,0 +1,123 @@
#!/usr/bin/env python3
"""Compare heading candidates between both pcap files."""

import struct


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:
            return None
        return val
    except:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

# Heading candidate offsets by packet size (from TWD analysis)
HEADING_OFFSETS = {
    344: 0x0144,
    446: 0x0189,
    788: 0x0081,
    888: 0x0081,
    931: 0x0081,
    1031: 0x0081,
    1472: 0x0088,
}

# TWD offset (consistent across sizes)
TWD_OFFSET = 0x006b

print("=" * 70)
print("COMPARING TWD (0x006b) AND HEADING CANDIDATES ACROSS BOTH PCAPS")
print("=" * 70)

for pcap_file in ["raymarine_sample.pcap", "raymarine_sample_twd_69-73.pcap"]:
    print(f"\n{'='*70}")
    print(f"FILE: {pcap_file}")
    print("=" * 70)

    packets = read_pcap(pcap_file)

    # Group by packet size
    by_size = {}
    for pkt in packets:
        pkt_len = len(pkt)
        if pkt_len not in by_size:
            by_size[pkt_len] = []
        by_size[pkt_len].append(pkt)

    print(f"\n{'Pkt Size':<10} {'Count':<8} {'TWD (0x006b)':<18} {'Heading':<18} {'Diff':<10}")
    print("-" * 70)

    for pkt_len in sorted(HEADING_OFFSETS.keys()):
        if pkt_len not in by_size:
            continue

        pkts = by_size[pkt_len]
        heading_offset = HEADING_OFFSETS[pkt_len]

        # Get all values
        twd_vals = []
        hdg_vals = []

        for pkt in pkts:
            twd = decode_float(pkt, TWD_OFFSET)
            hdg = decode_float(pkt, heading_offset)

            if twd and 0 <= twd <= 6.5:
                twd_vals.append(twd * 57.2958)
            if hdg and 0 <= hdg <= 6.5:
                hdg_vals.append(hdg * 57.2958)

        if twd_vals and hdg_vals:
            twd_avg = sum(twd_vals) / len(twd_vals)
            hdg_avg = sum(hdg_vals) / len(hdg_vals)
            diff = abs(twd_avg - hdg_avg)

            twd_str = f"{twd_avg:.1f}° ({min(twd_vals):.0f}-{max(twd_vals):.0f})"
            hdg_str = f"{hdg_avg:.1f}° ({min(hdg_vals):.0f}-{max(hdg_vals):.0f})"

            print(f"{pkt_len:<10} {len(pkts):<8} {twd_str:<18} {hdg_str:<18} {diff:.1f}°")
        elif twd_vals:
            twd_avg = sum(twd_vals) / len(twd_vals)
            print(f"{pkt_len:<10} {len(pkts):<8} {twd_avg:.1f}° ---")
        else:
            print(f"{pkt_len:<10} {len(pkts):<8} --- ---")

print("\n" + "=" * 70)
print("INTERPRETATION")
print("=" * 70)
print("""
If heading and TWD are consistent across both captures:
- The offsets are correct for those data types
- Difference should be within ~50° (boat at anchor pointing into wind)

Expected relationships:
- TWD capture: TWD ~69-73°, so heading should be ~20-120°
- Original capture: Need to check what values make sense
""")
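One caveat with the comparison above: `abs(twd_avg - hdg_avg)` overstates the difference when the two bearings straddle 0°/360° (e.g. 350° vs 10° reads as 340° apart instead of 20°). A circular difference, sketched here as a possible refinement, avoids that:

```python
def angle_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two bearings, in degrees (0-180)."""
    d = abs(a - b) % 360.0
    return 360.0 - d if d > 180.0 else d

print(angle_diff(350.0, 10.0))   # 20.0, not 340.0
print(angle_diff(70.0, 40.0))    # 30.0
```

Averaging wrapped bearings has the same pitfall; for headings that swing through north, averaging unit vectors (via `math.atan2` of mean sin/cos) would be more robust than the plain mean used above.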
83
axiom-nmea/debug/compare_offsets.py
Normal file
@@ -0,0 +1,83 @@
#!/usr/bin/env python3
"""Compare old offsets vs new 0x006b offset for wind direction."""

import struct
from collections import defaultdict


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

# Current offsets from decoder
OLD_OFFSETS = {
    344: 0x00a0,
    446: 0x00a7,
    788: 0x00c5,
    888: 0x00c5,
    931: 0x00c5,
    1031: 0x00c5,
    1472: 0x00fc,
}

NEW_OFFSET = 0x006b

for pcap_file in ["raymarine_sample.pcap", "raymarine_sample_twd_69-73.pcap"]:
    print(f"\n{'='*70}")
    print(f"FILE: {pcap_file}")
    print("="*70)

    packets = read_pcap(pcap_file)
    print(f"Loaded {len(packets)} packets\n")

    print(f"{'Pkt Size':<10} {'Old Offset':<12} {'Old Value':<15} {'New (0x006b)':<15}")
    print("-" * 55)

    for pkt_len in sorted(OLD_OFFSETS.keys()):
        old_offset = OLD_OFFSETS[pkt_len]

        # Find packets of this size
        matching = [p for p in packets if len(p) == pkt_len]
        if not matching:
            continue

        # Sample first packet of this size
        pkt = matching[0]

        old_val = decode_float(pkt, old_offset)
        new_val = decode_float(pkt, NEW_OFFSET)

        old_deg = f"{old_val * 57.2958:.1f}°" if old_val and 0 <= old_val <= 6.5 else "invalid"
        new_deg = f"{new_val * 57.2958:.1f}°" if new_val and 0 <= new_val <= 6.5 else "invalid"

        print(f"{pkt_len:<10} 0x{old_offset:04x} {old_deg:<15} {new_deg:<15}")
96
axiom-nmea/debug/debug_decode.py
Normal file
@@ -0,0 +1,96 @@
#!/usr/bin/env python3
"""Debug the full decode process."""

import struct
from collections import Counter


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        return struct.unpack('<f', data[offset:offset+4])[0]
    except:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

packets = read_pcap("raymarine_sample.pcap")
|
||||
|
||||
# Count packet sizes
|
||||
size_counts = Counter(len(p) for p in packets)
|
||||
print("Packet size distribution:")
|
||||
for size, count in sorted(size_counts.items(), key=lambda x: -x[1])[:15]:
|
||||
print(f" {size:5d} bytes: {count:4d} packets")
|
||||
|
||||
print("\n" + "="*60)
|
||||
print("Simulating wind extraction logic:")
|
||||
print("="*60)
|
||||
|
||||
last_speed = None
|
||||
last_dir = None
|
||||
extractions = 0
|
||||
|
||||
for pkt in packets:
|
||||
pkt_len = len(pkt)
|
||||
|
||||
# Same logic as decoder
|
||||
if 340 <= pkt_len <= 350:
|
||||
offset_pairs = [(0x00a5, 0x00a0), (0x00c3, 0x00a0), (0x00c8, 0x00a0)]
|
||||
elif 440 <= pkt_len <= 500:
|
||||
offset_pairs = [(0x00ac, 0x00a7), (0x00ca, 0x00a7), (0x00b1, 0x00a7)]
|
||||
elif 780 <= pkt_len <= 1100:
|
||||
offset_pairs = [(0x00ca, 0x00c5), (0x00e8, 0x0090), (0x00cf, 0x00c5)]
|
||||
elif 1400 <= pkt_len <= 1500:
|
||||
offset_pairs = [(0x0101, 0x00fc), (0x0106, 0x00fc), (0x011f, 0x00fc)]
|
||||
else:
|
||||
offset_pairs = [(0x00ca, 0x00c5), (0x00a5, 0x00a0)]
|
||||
|
||||
for speed_offset, dir_offset in offset_pairs:
|
||||
if speed_offset + 4 > pkt_len or dir_offset + 4 > pkt_len:
|
||||
continue
|
||||
|
||||
speed_val = decode_float(pkt, speed_offset)
|
||||
dir_val = decode_float(pkt, dir_offset)
|
||||
|
||||
if speed_val is None or dir_val is None:
|
||||
continue
|
||||
|
||||
# Validate
|
||||
if not (0 < speed_val < 60):
|
||||
continue
|
||||
if not (0 <= dir_val <= 6.5):
|
||||
continue
|
||||
|
||||
speed_kts = speed_val * 1.94384
|
||||
dir_deg = (dir_val * 57.2958) % 360
|
||||
|
||||
last_speed = speed_kts
|
||||
last_dir = dir_deg
|
||||
extractions += 1
|
||||
break
|
||||
|
||||
print(f"\nTotal successful extractions: {extractions}")
|
||||
print(f"Last wind speed: {last_speed:.1f} kts" if last_speed else "No speed")
|
||||
print(f"Last wind direction: {last_dir:.1f}°" if last_dir else "No direction")
|
||||
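Each copy of `read_pcap` in these scripts branches on the pcap magic number to pick a byte order for the record headers. That detection can be checked in isolation with synthetic global headers (variable names here are illustrative, not from the repo):

```python
import struct

def pcap_endian(header: bytes) -> str:
    """Return the struct byte-order prefix implied by a pcap global header."""
    magic = struct.unpack('<I', header[0:4])[0]
    # 0xd4c3b2a1 means the writer's byte order is opposite to ours
    return '>' if magic == 0xd4c3b2a1 else '<'

same_order = struct.pack('<I', 0xa1b2c3d4) + b'\x00' * 20
opposite = struct.pack('<I', 0xd4c3b2a1) + b'\x00' * 20
print(pcap_endian(same_order), pcap_endian(opposite))
```

Only the four length/timestamp fields of each record header need this prefix; the packet payloads themselves are raw bytes and unaffected.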
85
axiom-nmea/debug/debug_field13.py
Normal file
@@ -0,0 +1,85 @@
#!/usr/bin/env python3
"""Debug Field 13 extraction across different packet sizes."""

import struct
from protobuf_decoder import ProtobufParser, HEADER_SIZE, WIRE_FIXED32

def decode_float(raw):
    if len(raw) != 4:
        return None
    try:
        val = struct.unpack('<f', raw)[0]
        if val != val:  # NaN check
            return None
        return val
    except struct.error:
        return None

def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'
        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break
            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

packets = read_pcap("raymarine_sample_TWD_62-70_HDG_29-35.pcap")

# Group by size
by_size = {}
for pkt in packets:
    size = len(pkt)
    if size not in by_size:
        by_size[size] = []
    by_size[size].append(pkt)

print("Analyzing Field 13 by packet size:")
print("Expected: TWD 62-70°, HDG 29-35°")
print("=" * 70)

for size in sorted(by_size.keys()):
    if size < 200:
        continue

    pkts = by_size[size][:3]  # First 3 of each size

    print(f"\n{size} bytes ({len(by_size[size])} packets):")

    for pkt in pkts:
        proto_data = pkt[HEADER_SIZE:]
        parser = ProtobufParser(proto_data)
        fields = parser.parse_message()

        if 13 not in fields:
            print("  No Field 13")
            continue

        f13 = fields[13]
        if not f13.children:
            print("  Field 13 has no children")
            continue

        # Show all float fields in Field 13
        floats = []
        for fnum, child in sorted(f13.children.items()):
            if child.wire_type == WIRE_FIXED32:
                val = decode_float(child.value)
                if val is not None:
                    deg = val * 57.2958
                    floats.append(f"f{fnum}={deg:.1f}°")

        print(f"  {', '.join(floats)}")
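Every script in this directory converts candidate radian fields with the same 57.2958 factor and a modulo-360 wrap; isolated, the conversion is just:

```python
RAD_TO_DEG = 57.2958  # same constant used throughout these debug scripts

def to_heading(rad: float) -> float:
    """Convert a radian angle into a 0-360° compass heading."""
    return (rad * RAD_TO_DEG) % 360

print(to_heading(0.0), to_heading(3.14159), to_heading(6.28318))
```

The modulo matters for values just past 2π (sensor noise), which would otherwise print as headings above 360°.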
78
axiom-nmea/debug/debug_wind.py
Normal file
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
"""Debug wind extraction by examining actual packet bytes."""

import struct

PCAP_MAGIC = 0xa1b2c3d4

def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        return struct.unpack('<f', data[offset:offset+4])[0]
    except struct.error:
        return None

def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets

packets = read_pcap("raymarine_sample.pcap")

# Find packets of different sizes
sizes_seen = {}
for pkt in packets:
    plen = len(pkt)
    if plen not in sizes_seen:
        sizes_seen[plen] = pkt

print("Examining wind data at known offsets:\n")

for target_size in [344, 446, 788, 888, 1472]:
    # Find a packet close to this size
    for plen, pkt in sizes_seen.items():
        if abs(plen - target_size) <= 10:
            print(f"=== Packet size {plen} ===")

            # Show bytes around expected wind offsets
            if 340 <= plen <= 350:
                offsets = [0x00a0, 0x00a5, 0x00c3, 0x00c8]
            elif 440 <= plen <= 500:
                offsets = [0x00a7, 0x00ac, 0x00ca, 0x00b1]
            elif 780 <= plen <= 900:
                offsets = [0x0090, 0x00c5, 0x00ca, 0x00cf]
            elif 1400 <= plen <= 1500:
                offsets = [0x00fc, 0x0101, 0x0106]
            else:
                offsets = []

            for off in offsets:
                if off + 8 <= plen:
                    hex_bytes = ' '.join(f'{b:02x}' for b in pkt[off:off+8])
                    float_val = decode_float(pkt, off)
                    if float_val is None:
                        continue
                    # Also try treating the value as m/s and as radians
                    speed_kts = float_val * 1.94384
                    dir_deg = float_val * 57.2958
                    print(f"  0x{off:04x}: {hex_bytes} -> float={float_val:.4f} ({speed_kts:.1f} kts or {dir_deg:.1f}°)")
            print()
            break
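The sanity window these scripts apply before accepting a speed/direction pair (0-60 m/s for speed, 0-6.5 rad for angle) can be factored into one helper; a sketch using the same constants, with a hypothetical helper name:

```python
MS_TO_KTS = 1.94384  # metres/second -> knots, as used by the decoder

def validate_wind(speed_ms, dir_rad):
    """Apply the decoder's sanity window; return (kts, deg) or None."""
    if not (0 < speed_ms < 60):      # implausible boat-relative wind speed
        return None
    if not (0 <= dir_rad <= 6.5):    # angle must fit in ~0-2π radians
        return None
    return speed_ms * MS_TO_KTS, (dir_rad * 57.2958) % 360

print(validate_wind(5.0, 1.0))
print(validate_wind(120.0, 1.0))  # rejected: outside the speed window
```

Rejecting out-of-window pairs is what lets the offset-guessing loop try several candidate offsets per packet size and keep only the one that decodes plausibly.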
561
axiom-nmea/debug/dock_finder.py
Executable file
@@ -0,0 +1,561 @@
#!/usr/bin/env python3
"""
Dock Finder - Find SOG and COG fields while stationary at dock.

When at dock:
- SOG bounces between 0.0 and 0.2 kts (0 to ~0.1 m/s)
- COG jumps wildly between 0 and 359 degrees (0 to ~6.28 radians)

This script looks for paired fields that show these patterns:
1. Speed: small positive values near zero (0-0.1 m/s → 0-0.2 kts)
2. Angle: values spanning nearly the full 0-2π range (radians)

The script tracks variance over time to identify fluctuating fields.

Usage:
    python dock_finder.py -i 198.18.5.5
    python dock_finder.py -i 198.18.5.5 --samples 20 --interval 0.5
"""

import argparse
import math
import os
import signal
import socket
import struct
import sys
import time
from collections import defaultdict
from copy import copy
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict, List, Optional, Set, Tuple

# Add parent directory to path for library import
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from raymarine_nmea.protocol.parser import ProtobufParser, ProtoField
from raymarine_nmea.protocol.constants import (
    WIRE_VARINT, WIRE_FIXED64, WIRE_LENGTH, WIRE_FIXED32,
    HEADER_SIZE, RAD_TO_DEG, MS_TO_KTS,
)
from raymarine_nmea.sensors import MULTICAST_GROUPS

running = True


def signal_handler(signum, frame):
    global running
    running = False


@dataclass
class FieldStats:
    """Statistics for a field across multiple samples."""
    path: str
    wire_type: str
    values: List[float]

    @property
    def count(self) -> int:
        return len(self.values)

    @property
    def min_val(self) -> float:
        return min(self.values) if self.values else 0

    @property
    def max_val(self) -> float:
        return max(self.values) if self.values else 0

    @property
    def range_val(self) -> float:
        return self.max_val - self.min_val

    @property
    def mean(self) -> float:
        return sum(self.values) / len(self.values) if self.values else 0

    @property
    def variance(self) -> float:
        if len(self.values) < 2:
            return 0
        mean = self.mean
        return sum((v - mean) ** 2 for v in self.values) / len(self.values)

    @property
    def std_dev(self) -> float:
        return math.sqrt(self.variance)

    def is_sog_candidate(self) -> bool:
        """Check if this could be SOG at dock (0-0.1 m/s, some variance)."""
        # Must be small positive values in m/s range
        if self.min_val < -0.01:  # Allow tiny negative noise
            return False
        if self.max_val > 0.2:  # Max 0.2 m/s ≈ 0.4 kts
            return False
        if self.max_val < 0.001:  # Must have some value
            return False
        # Should have some variance (dock bouncing)
        if self.range_val < 0.001:
            return False
        return True

    def is_cog_candidate(self) -> bool:
        """Check if this could be COG at dock (full circle jumps in radians)."""
        # Must be in valid radian range (0 to 2π ≈ 6.28)
        if self.min_val < -0.1:
            return False
        if self.max_val > 7.0:  # Allow slightly over 2π
            return False
        # At dock, COG jumps wildly - expect large range
        # Range should span significant portion of circle (at least 90°)
        min_range_rad = math.pi / 2  # 90 degrees in radians
        if self.range_val < min_range_rad:
            return False
        # Variance should be high
        if self.std_dev < 0.5:  # Radians
            return False
        return True

    def as_degrees(self) -> Tuple[float, float, float]:
        """Return min, max, mean as degrees."""
        return (
            (self.min_val * RAD_TO_DEG) % 360,
            (self.max_val * RAD_TO_DEG) % 360,
            (self.mean * RAD_TO_DEG) % 360
        )

    def as_knots(self) -> Tuple[float, float, float]:
        """Return min, max, mean as knots."""
        return (
            self.min_val * MS_TO_KTS,
            self.max_val * MS_TO_KTS,
            self.mean * MS_TO_KTS
        )


def decode_float(raw: bytes) -> Optional[float]:
    """Decode 4 bytes as little-endian float."""
    if len(raw) == 4:
        try:
            val = struct.unpack('<f', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def decode_double(raw: bytes) -> Optional[float]:
    """Decode 8 bytes as little-endian double."""
    if len(raw) == 8:
        try:
            val = struct.unpack('<d', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def scan_fields(pf: ProtoField, path: str, results: Dict[str, Tuple[str, float]]):
    """Recursively scan fields and collect numeric values with their wire types."""

    if pf.wire_type == WIRE_FIXED32:
        val = decode_float(pf.value)
        if val is not None:
            results[path] = ('f32', val)

    elif pf.wire_type == WIRE_FIXED64:
        val = decode_double(pf.value)
        if val is not None:
            results[path] = ('f64', val)

    elif pf.wire_type == WIRE_VARINT:
        # Skip varints - unlikely to be COG/SOG
        pass

    # Recurse into children
    if pf.children:
        for child_num, child in pf.children.items():
            scan_fields(child, f"{path}.{child_num}", results)


def scan_packet(packet: bytes) -> Dict[str, Tuple[str, float]]:
    """Scan a packet and return all numeric fields."""
    results = {}
    if len(packet) < HEADER_SIZE + 10:
        return results

    proto_data = packet[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    fields = parser.parse_message(collect_repeated={14, 16, 20})

    for field_num, val in fields.items():
        if isinstance(val, list):
            for i, pf in enumerate(val):
                scan_fields(pf, f"{field_num}[{i}]", results)
        else:
            scan_fields(val, f"{field_num}", results)

    return results


def find_parent_group(path: str) -> str:
    """Extract parent field group from path (e.g., '3.1' -> '3')."""
    parts = path.split('.')
    return parts[0] if parts else path


def main():
    global running

    parser = argparse.ArgumentParser(
        description="Find SOG/COG fields while at dock",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Expected patterns at dock:
  SOG: ~0.0-0.2 kts (0-0.1 m/s) with small fluctuations
  COG: Wildly jumping 0-359° (0-6.28 rad) due to GPS noise at low speed

The script will identify fields matching these patterns and group them.
"""
    )
    parser.add_argument('-i', '--interface', required=True,
                        help='Interface IP for Raymarine multicast (e.g., 198.18.5.5)')
    parser.add_argument('-n', '--samples', type=int, default=30,
                        help='Number of samples to collect (default: 30)')
    parser.add_argument('--interval', type=float, default=0.5,
                        help='Seconds between samples (default: 0.5)')
    parser.add_argument('--sog-max', type=float, default=0.2,
                        help='Max expected SOG in knots at dock (default: 0.2)')
    parser.add_argument('--cog-range', type=float, default=90,
                        help='Min expected COG range in degrees (default: 90)')
    parser.add_argument('--verbose', '-v', action='store_true',
                        help='Show all fields, not just candidates')

    args = parser.parse_args()

    signal.signal(signal.SIGINT, signal_handler)
    signal.signal(signal.SIGTERM, signal_handler)

    # Create sockets
    sockets = []
    for group, port in MULTICAST_GROUPS:
        try:
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            if hasattr(socket, 'SO_REUSEPORT'):
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
            sock.bind(('', port))
            mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(args.interface))
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            sock.setblocking(False)
            sockets.append((sock, group, port))
        except Exception as e:
            print(f"Warning: Could not join {group}:{port}: {e}")

    if not sockets:
        print("Error: Could not join any multicast groups")
        sys.exit(1)

    print("=" * 70)
    print("DOCK FINDER - SOG/COG Field Discovery")
    print("=" * 70)
    print(f"Joined {len(sockets)} multicast groups:")
    for sock, group, port in sockets:
        print(f"  - {group}:{port}")
    print()
    print("Looking for fields at dock:")
    print(f"  SOG: 0.0 - {args.sog_max:.1f} kts (fluctuating near zero)")
    print(f"  COG: Jumping with range >= {args.cog_range}° (GPS noise at low speed)")
    print(f"Collecting {args.samples} samples at {args.interval}s intervals...")
    print("-" * 70)

    # Collect samples with diagnostics - track by packet size
    field_data: Dict[str, FieldStats] = {}
    field_data_by_size: Dict[int, Dict[str, FieldStats]] = defaultdict(dict)
    samples_collected = 0
    last_sample_time_by_size: Dict[int, float] = defaultdict(float)

    # Diagnostic counters
    packets_by_group: Dict[str, int] = defaultdict(int)
    packets_by_size: Dict[int, int] = defaultdict(int)
    empty_parse_count = 0
    total_packets = 0

    try:
        while running and samples_collected < args.samples:
            for sock, group, port in sockets:
                try:
                    data, addr = sock.recvfrom(65535)
                    pkt_size = len(data)
                    total_packets += 1
                    packets_by_group[f"{group}:{port}"] += 1
                    packets_by_size[pkt_size] += 1

                    now = time.time()
                    # Rate limit per packet size, not globally
                    if (now - last_sample_time_by_size[pkt_size]) < args.interval:
                        continue

                    results = scan_packet(data)
                    if not results:
                        empty_parse_count += 1
                        continue

                    samples_collected += 1
                    last_sample_time_by_size[pkt_size] = now

                    # Update statistics (global and per-size)
                    for path, (wire_type, value) in results.items():
                        # Global stats
                        if path not in field_data:
                            field_data[path] = FieldStats(path, wire_type, [])
                        field_data[path].values.append(value)

                        # Per-size stats
                        size_fields = field_data_by_size[pkt_size]
                        if path not in size_fields:
                            size_fields[path] = FieldStats(path, wire_type, [])
                        size_fields[path].values.append(value)

                    # Progress indicator
                    pct = min(100, (samples_collected / args.samples) * 100)
                    print(f"\r  Collecting: {samples_collected}/{args.samples} ({pct:.0f}%) [pkts: {total_packets}]", end='', flush=True)

                except BlockingIOError:
                    continue

            time.sleep(0.01)

    finally:
        for sock, _, _ in sockets:
            sock.close()

    print()  # Newline after progress

    # Show packet diagnostics
    print()
    print("*** PACKET DIAGNOSTICS ***")
    print("-" * 70)
    print(f"  Total packets received: {total_packets}")
    print(f"  Packets with no parseable fields: {empty_parse_count}")
    print()
    print("  Packets by multicast group:")
    for grp, cnt in sorted(packets_by_group.items(), key=lambda x: -x[1]):
        print(f"    {grp}: {cnt}")
    print()
    print("  Packets by size (top 10):")
    for size, cnt in sorted(packets_by_size.items(), key=lambda x: -x[1])[:10]:
        print(f"    {size} bytes: {cnt}")

    # Show fields by packet size
    print()
    print("*** FIELDS BY PACKET SIZE ***")
    print("-" * 70)
    for pkt_size in sorted(field_data_by_size.keys(), reverse=True):
        size_fields = field_data_by_size[pkt_size]
        if not size_fields:
            continue
        field_paths = sorted(size_fields.keys())
        sample_count = max(s.count for s in size_fields.values()) if size_fields else 0
        print(f"\n  {pkt_size} bytes ({sample_count} samples, {len(field_paths)} fields):")
        for path in field_paths[:20]:  # Show first 20 fields
            stats = size_fields[path]
            # Show with interpretation
            interp = ""
            if 0 <= stats.min_val and stats.max_val <= 7:
                min_deg = (stats.min_val * RAD_TO_DEG) % 360
                max_deg = (stats.max_val * RAD_TO_DEG) % 360
                interp = f" | {min_deg:.1f}°-{max_deg:.1f}°"
            elif 0 <= stats.min_val and stats.max_val <= 50:
                min_kts = stats.min_val * MS_TO_KTS
                max_kts = stats.max_val * MS_TO_KTS
                interp = f" | {min_kts:.2f}-{max_kts:.2f} kts"
            print(f"    {path:<15} {stats.min_val:>10.4f} - {stats.max_val:>10.4f} (range: {stats.range_val:.4f}){interp}")
        if len(field_paths) > 20:
            print(f"    ... and {len(field_paths) - 20} more fields")

    if samples_collected < 5:
        print("\nError: Not enough samples collected. Check your network connection.")
        sys.exit(1)

    # Analyze results - use per-packet-size data for better detection
    print()
    print("=" * 70)
    print(f"ANALYSIS RESULTS ({samples_collected} samples)")
    print("=" * 70)

    sog_candidates = []
    cog_candidates = []

    # Analyze each packet size separately
    for pkt_size, size_fields in field_data_by_size.items():
        for path, stats in size_fields.items():
            if stats.count < 3:  # Need at least 3 samples
                continue

            if stats.is_sog_candidate():
                # Add packet size info to path for clarity
                stats.path = f"{path} ({pkt_size}B)"
                sog_candidates.append(stats)
            if stats.is_cog_candidate():
                stats.path = f"{path} ({pkt_size}B)"
                cog_candidates.append(stats)

    # Print SOG candidates
    print("\n*** POTENTIAL SOG FIELDS (speed near zero with fluctuation) ***")
    print("-" * 70)
    if sog_candidates:
        for stats in sorted(sog_candidates, key=lambda s: s.std_dev, reverse=True):
            min_kts, max_kts, mean_kts = stats.as_knots()
            print(f"  {stats.path}")
            print(f"    Raw (m/s): {stats.min_val:.4f} - {stats.max_val:.4f} "
                  f"(mean: {stats.mean:.4f}, std: {stats.std_dev:.4f})")
            print(f"    As knots:  {min_kts:.3f} - {max_kts:.3f} "
                  f"(mean: {mean_kts:.3f})")
            print()
    else:
        print("  (No candidates found)")
        print("  Try increasing --sog-max or collecting more samples")

    # Print COG candidates
    print("\n*** POTENTIAL COG FIELDS (angle jumping widely) ***")
    print("-" * 70)
    if cog_candidates:
        for stats in sorted(cog_candidates, key=lambda s: s.range_val, reverse=True):
            min_deg = (stats.min_val * RAD_TO_DEG) % 360
            max_deg = (stats.max_val * RAD_TO_DEG) % 360
            range_deg = stats.range_val * RAD_TO_DEG
            print(f"  {stats.path}")
            print(f"    Raw (rad):  {stats.min_val:.4f} - {stats.max_val:.4f} "
                  f"(range: {stats.range_val:.2f} rad, std: {stats.std_dev:.2f})")
            print(f"    As degrees: {min_deg:.1f}° - {max_deg:.1f}° "
                  f"(range: {range_deg:.1f}°)")
            print()
    else:
        print("  (No candidates found)")
        print("  Try decreasing --cog-range or collecting more samples")

    # Look for paired candidates in the same parent group
    print("\n*** PAIRED SOG/COG CANDIDATES (same field group) ***")
    print("-" * 70)

    sog_groups = {find_parent_group(s.path): s for s in sog_candidates}
    cog_groups = {find_parent_group(s.path): s for s in cog_candidates}

    common_groups = set(sog_groups.keys()) & set(cog_groups.keys())

    if common_groups:
        for group in sorted(common_groups):
            sog = sog_groups[group]
            cog = cog_groups[group]
            min_kts, max_kts, _ = sog.as_knots()
            range_deg = cog.range_val * RAD_TO_DEG

            print(f"  Field Group {group}:")
            print(f"    SOG: {sog.path}")
            print(f"         {min_kts:.3f} - {max_kts:.3f} kts")
            print(f"    COG: {cog.path}")
            print(f"         Range: {range_deg:.1f}° (std: {cog.std_dev:.2f} rad)")
            print()
    else:
        print("  No paired SOG/COG fields found in the same group")
        print("  SOG and COG may be in different field groups")

    # Show sample values for top candidates
    if sog_candidates or cog_candidates:
        print("\n*** LAST 5 SAMPLE VALUES ***")
        print("-" * 70)

        if sog_candidates:
            top_sog = sorted(sog_candidates, key=lambda s: s.std_dev, reverse=True)[0]
            print(f"  Top SOG candidate ({top_sog.path}):")
            last_vals = top_sog.values[-5:]
            kts_vals = [v * MS_TO_KTS for v in last_vals]
            print(f"    m/s: {[f'{v:.4f}' for v in last_vals]}")
            print(f"    kts: {[f'{v:.3f}' for v in kts_vals]}")

        if cog_candidates:
            top_cog = sorted(cog_candidates, key=lambda s: s.range_val, reverse=True)[0]
            print(f"  Top COG candidate ({top_cog.path}):")
            last_vals = top_cog.values[-5:]
            deg_vals = [(v * RAD_TO_DEG) % 360 for v in last_vals]
            print(f"    rad: {[f'{v:.4f}' for v in last_vals]}")
            print(f"    deg: {[f'{v:.1f}' for v in deg_vals]}")

    # Show diagnostic info if no candidates found, or if verbose
    no_candidates = not sog_candidates and not cog_candidates
    if args.verbose or no_candidates:
        print("\n*** ALL NUMERIC FIELDS (diagnostic - per packet size) ***")
        print("-" * 70)
        print(f"  {'Path':<25} {'Type':<5} {'Min':>12} {'Max':>12} {'Range':>12} {'StdDev':>10}")
        print("-" * 70)

        # Collect all valid fields from per-packet-size data
        all_fields = []
        for pkt_size, size_fields in sorted(field_data_by_size.items(), reverse=True):
            for path, stats in size_fields.items():
                if stats.count < 3:
                    continue
                # Create a copy with packet size in path
                stats_copy = copy(stats)
                stats_copy.path = f"{path} ({pkt_size}B)"
                all_fields.append(stats_copy)

        # Sort by range (most variable first)
        for stats in sorted(all_fields, key=lambda s: s.range_val, reverse=True):
            print(f"  {stats.path:<25} {stats.wire_type:<5} "
                  f"{stats.min_val:>12.4f} {stats.max_val:>12.4f} "
                  f"{stats.range_val:>12.4f} {stats.std_dev:>10.4f}")

        # Show interpretation hints for top variable fields
        print("\n*** INTERPRETATION HINTS (top 10 most variable fields) ***")
        print("-" * 70)
        top_variable = sorted(all_fields, key=lambda s: s.range_val, reverse=True)[:10]

        for stats in top_variable:
            print(f"\n  {stats.path} ({stats.wire_type}):")
            print(f"    Raw: {stats.min_val:.6f} to {stats.max_val:.6f}")

            # Try angle interpretation (radians)
            if 0 <= stats.min_val and stats.max_val <= 7:
                min_deg = (stats.min_val * RAD_TO_DEG) % 360
                max_deg = (stats.max_val * RAD_TO_DEG) % 360
                range_deg = stats.range_val * RAD_TO_DEG
                print(f"    As angle (rad->deg): {min_deg:.1f}° to {max_deg:.1f}° (range: {range_deg:.1f}°)")

            # Try speed interpretation (m/s)
            if 0 <= stats.min_val and stats.max_val <= 100:
                min_kts = stats.min_val * MS_TO_KTS
                max_kts = stats.max_val * MS_TO_KTS
                print(f"    As speed (m/s->kts): {min_kts:.3f} to {max_kts:.3f} kts")

            # Try temperature interpretation (Kelvin)
            if 250 <= stats.min_val <= 350:
                min_c = stats.min_val - 273.15
                max_c = stats.max_val - 273.15
                print(f"    As temp (K->°C): {min_c:.1f}°C to {max_c:.1f}°C")

            # GPS coordinate check
            if -180 <= stats.min_val <= 180 and stats.range_val < 1:
                print("    Could be GPS coordinate (low variance)")

    if no_candidates:
        print("\n" + "=" * 70)
        print("SUGGESTIONS:")
        print("=" * 70)
        print("  No automatic matches found. Look at the fields above for:")
        print("  - SOG: Small values (< 0.5 m/s) with some variance")
        print("  - COG: Values in 0-6.28 range (radians) with HIGH variance")
        print()
        print("  Common issues:")
        print("  - GPS may not have lock (check for lat/lon)")
        print("  - Values may be in different units than expected")
        print("  - Try: --sog-max 1.0 --cog-range 45")


if __name__ == "__main__":
    main()
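The candidate tests in dock_finder.py reduce to range and standard-deviation thresholds over the collected samples. A compact sketch of the COG heuristic, with the thresholds copied from `is_cog_candidate` and the class stripped to a plain function:

```python
import math

def is_cog_candidate(values):
    """Mimic dock_finder's COG test: radian-range angles with high spread."""
    lo, hi = min(values), max(values)
    if lo < -0.1 or hi > 7.0:          # must stay within ~0-2π radians
        return False
    if hi - lo < math.pi / 2:          # must span at least 90 degrees
        return False
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return std >= 0.5                  # wildly jumping, not slowly drifting

print(is_cog_candidate([0.1, 2.0, 4.5, 6.1]))    # dock-like GPS noise
print(is_cog_candidate([1.0, 1.02, 1.01, 1.0]))  # steady heading
```

The population (not sample) standard deviation is used here, matching the `variance` property in the script.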
412
axiom-nmea/debug/field5_study.py
Executable file
@@ -0,0 +1,412 @@
#!/usr/bin/env python3
"""
Field 5 Study - Comprehensive analysis of Field 5 subfields.

Field 5 appears to contain SOG/COG data based on dock testing.
This script collects extensive samples to document all subfields.

Usage:
    python field5_study.py -i 198.18.5.5
    python field5_study.py -i 198.18.5.5 --samples 100 --interval 0.2
"""

import argparse
import math
import os
import signal
import socket
import struct
import sys
import time
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict, List, Optional, Tuple

# Add parent directory to path for library import
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from raymarine_nmea.protocol.parser import ProtobufParser, ProtoField
from raymarine_nmea.protocol.constants import (
    WIRE_VARINT, WIRE_FIXED64, WIRE_LENGTH, WIRE_FIXED32,
    HEADER_SIZE, RAD_TO_DEG, MS_TO_KTS,
)
from raymarine_nmea.sensors import MULTICAST_GROUPS

running = True


def signal_handler(signum, frame):
    global running
    running = False


@dataclass
class FieldStats:
    """Statistics for a field across multiple samples."""
    path: str
    wire_type: str
    values: List[float] = field(default_factory=list)

    @property
    def count(self) -> int:
        return len(self.values)

    @property
    def min_val(self) -> float:
        return min(self.values) if self.values else 0

    @property
    def max_val(self) -> float:
        return max(self.values) if self.values else 0

    @property
    def range_val(self) -> float:
        return self.max_val - self.min_val

    @property
    def mean(self) -> float:
        return sum(self.values) / len(self.values) if self.values else 0

    @property
    def std_dev(self) -> float:
        if len(self.values) < 2:
            return 0
        mean = self.mean
        variance = sum((v - mean) ** 2 for v in self.values) / len(self.values)
        return math.sqrt(variance)


def decode_float(raw: bytes) -> Optional[float]:
    """Decode 4 bytes as little-endian float."""
    if len(raw) == 4:
        try:
            val = struct.unpack('<f', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def decode_double(raw: bytes) -> Optional[float]:
    """Decode 8 bytes as little-endian double."""
    if len(raw) == 8:
        try:
            val = struct.unpack('<d', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def extract_field5(packet: bytes) -> Dict[str, Tuple[str, float]]:
    """Extract all Field 5 subfields from a packet."""
    results = {}
    if len(packet) < HEADER_SIZE + 10:
        return results

    proto_data = packet[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    fields = parser.parse_message()

    # Look for Field 5
    if 5 not in fields:
        return results

    field5 = fields[5]

    # If Field 5 is a nested message, extract children
    if field5.children:
        for child_num, child in field5.children.items():
            path = f"5.{child_num}"

            if child.wire_type == WIRE_FIXED32:
                val = decode_float(child.value)
                if val is not None:
                    results[path] = ('f32', val)

            elif child.wire_type == WIRE_FIXED64:
                val = decode_double(child.value)
                if val is not None:
                    results[path] = ('f64', val)

            elif child.wire_type == WIRE_VARINT:
                results[path] = ('var', float(child.value))

            # Check for deeper nesting
            if child.children:
                for subchild_num, subchild in child.children.items():
                    subpath = f"5.{child_num}.{subchild_num}"
                    if subchild.wire_type == WIRE_FIXED32:
                        val = decode_float(subchild.value)
                        if val is not None:
                            results[subpath] = ('f32', val)
                    elif subchild.wire_type == WIRE_FIXED64:
                        val = decode_double(subchild.value)
                        if val is not None:
                            results[subpath] = ('f64', val)

    # Field 5 itself might be a scalar
    elif field5.wire_type == WIRE_FIXED32:
        val = decode_float(field5.value)
        if val is not None:
            results['5'] = ('f32', val)

    elif field5.wire_type == WIRE_FIXED64:
        val = decode_double(field5.value)
|
||||
if val is not None:
|
||||
results['5'] = ('f64', val)
|
||||
|
||||
return results
|
||||
|
||||
|
||||
def interpret_value(val: float, wire_type: str) -> Dict[str, str]:
|
||||
"""Generate possible interpretations of a value."""
|
||||
interps = {}
|
||||
|
||||
# Angle (radians to degrees)
|
||||
if 0 <= val <= 2 * math.pi + 0.5:
|
||||
deg = (val * RAD_TO_DEG) % 360
|
||||
interps['angle'] = f"{deg:.1f}°"
|
||||
|
||||
# Speed (m/s to knots)
|
||||
if 0 <= val <= 100:
|
||||
kts = val * MS_TO_KTS
|
||||
interps['speed'] = f"{kts:.2f} kts"
|
||||
|
||||
# Small angle (degrees already)
|
||||
if 0 <= val <= 360:
|
||||
interps['deg_direct'] = f"{val:.1f}° (if already degrees)"
|
||||
|
||||
# Temperature (Kelvin)
|
||||
if 250 <= val <= 350:
|
||||
c = val - 273.15
|
||||
interps['temp'] = f"{c:.1f}°C"
|
||||
|
||||
return interps
|
||||
|
||||
|
||||
def main():
|
||||
global running
|
||||
|
||||
parser = argparse.ArgumentParser(
|
||||
description="Study Field 5 subfields comprehensively",
|
||||
formatter_class=argparse.RawDescriptionHelpFormatter,
|
||||
)
|
||||
parser.add_argument('-i', '--interface', required=True,
|
||||
help='Interface IP for Raymarine multicast')
|
||||
parser.add_argument('-n', '--samples', type=int, default=50,
|
||||
help='Number of samples to collect (default: 50)')
|
||||
parser.add_argument('--interval', type=float, default=0.3,
|
||||
help='Seconds between samples (default: 0.3)')
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
signal.signal(signal.SIGINT, signal_handler)
|
||||
signal.signal(signal.SIGTERM, signal_handler)
|
||||
|
||||
# Create sockets
|
||||
sockets = []
|
||||
for group, port in MULTICAST_GROUPS:
|
||||
try:
|
||||
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
|
||||
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
|
||||
if hasattr(socket, 'SO_REUSEPORT'):
|
||||
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
|
||||
sock.bind(('', port))
|
||||
mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(args.interface))
|
||||
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
|
||||
sock.setblocking(False)
|
||||
sockets.append((sock, group, port))
|
||||
except Exception as e:
|
||||
print(f"Warning: Could not join {group}:{port}: {e}")
|
||||
|
||||
if not sockets:
|
||||
print("Error: Could not join any multicast groups")
|
||||
sys.exit(1)
|
||||
|
||||
print("=" * 80)
|
||||
print("FIELD 5 COMPREHENSIVE STUDY")
|
||||
print("=" * 80)
|
||||
print(f"Collecting {args.samples} samples at {args.interval}s intervals...")
|
||||
print("-" * 80)
|
||||
|
||||
# Track by packet size
|
||||
field5_by_size: Dict[int, Dict[str, FieldStats]] = defaultdict(dict)
|
||||
packets_with_field5 = 0
|
||||
packets_without_field5 = 0
|
||||
total_packets = 0
|
||||
last_sample_time_by_size: Dict[int, float] = defaultdict(float)
|
||||
|
||||
try:
|
||||
samples_collected = 0
|
||||
while running and samples_collected < args.samples:
|
||||
for sock, group, port in sockets:
|
||||
try:
|
||||
data, addr = sock.recvfrom(65535)
|
||||
pkt_size = len(data)
|
||||
total_packets += 1
|
||||
|
||||
now = time.time()
|
||||
if (now - last_sample_time_by_size[pkt_size]) < args.interval:
|
||||
continue
|
||||
|
||||
results = extract_field5(data)
|
||||
|
||||
if results:
|
||||
packets_with_field5 += 1
|
||||
samples_collected += 1
|
||||
last_sample_time_by_size[pkt_size] = now
|
||||
|
||||
for path, (wire_type, value) in results.items():
|
||||
size_fields = field5_by_size[pkt_size]
|
||||
if path not in size_fields:
|
||||
size_fields[path] = FieldStats(path, wire_type, [])
|
||||
size_fields[path].values.append(value)
|
||||
|
||||
pct = (samples_collected / args.samples) * 100
|
||||
print(f"\r Collecting: {samples_collected}/{args.samples} ({pct:.0f}%)", end='', flush=True)
|
||||
else:
|
||||
packets_without_field5 += 1
|
||||
|
||||
except BlockingIOError:
|
||||
continue
|
||||
|
||||
time.sleep(0.01)
|
||||
|
||||
finally:
|
||||
for sock, _, _ in sockets:
|
||||
sock.close()
|
||||
|
||||
print()
|
||||
print()
|
||||
|
||||
# Summary
|
||||
print("=" * 80)
|
||||
print("FIELD 5 STUDY RESULTS")
|
||||
print("=" * 80)
|
||||
print(f" Total packets scanned: {total_packets}")
|
||||
print(f" Packets with Field 5: {packets_with_field5}")
|
||||
print(f" Packets without Field 5: {packets_without_field5}")
|
||||
print()
|
||||
|
||||
if not field5_by_size:
|
||||
print(" No Field 5 data found!")
|
||||
sys.exit(1)
|
||||
|
||||
# Show Field 5 structure by packet size
|
||||
print("=" * 80)
|
||||
print("FIELD 5 SUBFIELDS BY PACKET SIZE")
|
||||
print("=" * 80)
|
||||
|
||||
all_subfields = set()
|
||||
for pkt_size in sorted(field5_by_size.keys()):
|
||||
size_fields = field5_by_size[pkt_size]
|
||||
all_subfields.update(size_fields.keys())
|
||||
|
||||
# For each packet size that has Field 5
|
||||
for pkt_size in sorted(field5_by_size.keys()):
|
||||
size_fields = field5_by_size[pkt_size]
|
||||
if not size_fields:
|
||||
continue
|
||||
|
||||
sample_count = max(s.count for s in size_fields.values())
|
||||
print(f"\n--- {pkt_size} bytes ({sample_count} samples) ---")
|
||||
print()
|
||||
|
||||
for path in sorted(size_fields.keys(), key=lambda x: [int(p) for p in x.split('.')]):
|
||||
stats = size_fields[path]
|
||||
|
||||
print(f" {path} ({stats.wire_type}):")
|
||||
print(f" Samples: {stats.count}")
|
||||
print(f" Range: {stats.min_val:.6f} to {stats.max_val:.6f}")
|
||||
print(f" Mean: {stats.mean:.6f}")
|
||||
print(f" StdDev: {stats.std_dev:.6f}")
|
||||
|
||||
# Show interpretations
|
||||
interps = interpret_value(stats.mean, stats.wire_type)
|
||||
if interps:
|
||||
print(f" Interpretations:")
|
||||
for itype, ival in interps.items():
|
||||
print(f" - As {itype}: {ival}")
|
||||
|
||||
# Behavioral analysis
|
||||
if stats.std_dev < 0.001 and stats.count >= 3:
|
||||
print(f" Behavior: CONSTANT")
|
||||
elif stats.range_val > 3.0 and 0 <= stats.min_val <= 7:
|
||||
range_deg = stats.range_val * RAD_TO_DEG
|
||||
print(f" Behavior: HIGHLY VARIABLE ({range_deg:.0f}° range) - likely COG or heading")
|
||||
elif stats.range_val > 0.01 and stats.max_val < 1.0:
|
||||
range_kts = stats.range_val * MS_TO_KTS
|
||||
print(f" Behavior: SMALL FLUCTUATION ({range_kts:.3f} kts range) - could be SOG")
|
||||
|
||||
print()
|
||||
|
||||
# Summary table
|
||||
print("=" * 80)
|
||||
print("FIELD 5 SUMMARY TABLE")
|
||||
print("=" * 80)
|
||||
print()
|
||||
print(f" {'Subfield':<10} {'Type':<5} {'Min':>12} {'Max':>12} {'StdDev':>10} {'Behavior':<20} {'Likely Purpose'}")
|
||||
print("-" * 95)
|
||||
|
||||
# Aggregate across all packet sizes for the summary
|
||||
aggregated: Dict[str, FieldStats] = {}
|
||||
for pkt_size, size_fields in field5_by_size.items():
|
||||
for path, stats in size_fields.items():
|
||||
if path not in aggregated:
|
||||
aggregated[path] = FieldStats(path, stats.wire_type, [])
|
||||
aggregated[path].values.extend(stats.values)
|
||||
|
||||
for path in sorted(aggregated.keys(), key=lambda x: [int(p) for p in x.split('.')]):
|
||||
stats = aggregated[path]
|
||||
|
||||
# Determine behavior
|
||||
if stats.std_dev < 0.001:
|
||||
behavior = "Constant"
|
||||
elif stats.range_val > 3.0:
|
||||
behavior = f"Variable ({stats.range_val * RAD_TO_DEG:.0f}° range)"
|
||||
elif stats.range_val > 0.01:
|
||||
behavior = f"Fluctuating"
|
||||
else:
|
||||
behavior = "Near-constant"
|
||||
|
||||
# Guess purpose based on behavior and value range
|
||||
purpose = "Unknown"
|
||||
if stats.range_val > 3.0 and 0 <= stats.min_val <= 7:
|
||||
purpose = "COG or heading"
|
||||
elif stats.std_dev < 0.001 and 0.005 <= stats.mean <= 0.5:
|
||||
purpose = "SOG (at dock)"
|
||||
elif stats.std_dev < 0.001 and stats.mean > 10:
|
||||
purpose = "Fixed parameter"
|
||||
elif 0 <= stats.mean <= 0.2 and stats.max_val < 1:
|
||||
purpose = "SOG candidate"
|
||||
|
||||
print(f" {path:<10} {stats.wire_type:<5} {stats.min_val:>12.4f} {stats.max_val:>12.4f} "
|
||||
f"{stats.std_dev:>10.4f} {behavior:<20} {purpose}")
|
||||
|
||||
print()
|
||||
print("=" * 80)
|
||||
print("INTERPRETATION GUIDE")
|
||||
print("=" * 80)
|
||||
print("""
|
||||
Based on dock behavior (SOG ~0, COG jumping wildly):
|
||||
|
||||
- COG (Course Over Ground): Look for fields with HIGH variance spanning
|
||||
most of 0-2π radians (~0-360°). At dock, GPS-derived COG is unreliable
|
||||
and jumps randomly.
|
||||
|
||||
- SOG (Speed Over Ground): Look for fields with small values (~0.01-0.1 m/s)
|
||||
that are relatively constant at dock. May show slight fluctuation.
|
||||
|
||||
- Heading: May be similar to COG but derived from compass, so more stable.
|
||||
|
||||
- Fixed parameters: Constants like 0.05, 0.1, 11.93 may be configuration
|
||||
values, damping factors, or display settings.
|
||||
""")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
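The interpret_value heuristics above rest on two unit conversions (radians to degrees, m/s to knots). A minimal standalone sketch of the same ambiguity, with constants assumed to match the project's RAD_TO_DEG and MS_TO_KTS:

```python
import math

RAD_TO_DEG = 180.0 / math.pi  # assumed value of the project constant
MS_TO_KTS = 1.94384           # metres per second -> knots (assumed)

# A raw float of ~0.55 could plausibly be either an angle in radians
# or a speed in m/s; the study script keeps both readings and lets
# the dock-behavior analysis decide which is physically plausible.
val = 0.55
as_angle = (val * RAD_TO_DEG) % 360
as_speed = val * MS_TO_KTS
print(f"as angle: {as_angle:.1f} deg")
print(f"as speed: {as_speed:.2f} kts")
```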
496
axiom-nmea/debug/field_debugger.py
Normal file
@@ -0,0 +1,496 @@
#!/usr/bin/env python3
"""
Field Debugger - Shows all protobuf fields in columns for mapping real-world values.

Displays each top-level field as a column, with subfields as rows.
Updates every few seconds to show value progression over time.

Usage:
    python3 field_debugger.py -i 192.168.1.100        # Live capture
    python3 field_debugger.py --pcap capture.pcap     # From file
    python3 field_debugger.py --pcap capture.pcap -n 5   # Show 5 snapshots
"""

import struct
import socket
import time
import argparse
import threading
import sys
from datetime import datetime
from collections import defaultdict
from typing import Dict, List, Any, Optional, Tuple

# Wire types
WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.98", 2561),
    ("226.192.206.99", 2562),
    ("226.192.206.100", 2563),
    ("226.192.206.101", 2564),
    ("226.192.206.102", 2565),
    ("226.192.219.0", 3221),
    ("239.2.1.1", 2154),  # May contain tank/engine data
]


class ProtobufParser:
    """Parse protobuf without a schema."""

    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def parse(self) -> Dict[int, Any]:
        """Parse message, return dict of field_num -> (wire_type, value, children)."""
        fields = {}
        while self.pos < len(self.data):
            try:
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 1000:
                    break

                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                    children = None
                elif wire_type == WIRE_FIXED64:
                    value = self.data[self.pos:self.pos + 8]
                    self.pos += 8
                    children = None
                elif wire_type == WIRE_LENGTH:
                    length = self.read_varint()
                    value = self.data[self.pos:self.pos + length]
                    self.pos += length
                    # Try to parse as a nested message
                    try:
                        nested = ProtobufParser(value)
                        children = nested.parse()
                        if nested.pos < len(value) * 0.5:
                            children = None
                    except Exception:
                        children = None
                elif wire_type == WIRE_FIXED32:
                    value = self.data[self.pos:self.pos + 4]
                    self.pos += 4
                    children = None
                else:
                    break

                fields[field_num] = (wire_type, value, children)
            except Exception:
                break
        return fields


# Known field labels from reverse engineering
FIELD_LABELS = {
    # Top-level fields
    (1,): "DeviceInfo",
    (2,): "GPS",
    (3,): "HeadingBlock",
    (7,): "DepthBlock",
    (8,): "RateOfTurn",
    (10,): "Unknown10",
    (12,): "Unknown12",
    (13,): "WindNav",
    (14,): "SensorData",
    (21,): "Angles",

    # Field 1 subfields (Device Info)
    (1, 1): "DeviceName",
    (1, 2): "SerialInfo",

    # Field 2 subfields (GPS)
    (2, 1): "LATITUDE",
    (2, 2): "LONGITUDE",
    (2, 3): "Unknown",
    (2, 4): "Altitude?",
    (2, 5): "Timestamp?",
    (2, 6): "Distance?",

    # Field 3 subfields (Heading)
    (3, 1): "HeadingRaw",
    (3, 2): "HEADING",

    # Field 7 subfields (Depth) - only in larger packets (1472B+)
    (7, 1): "DEPTH_M",  # Depth in METERS

    # Field 8 subfields
    (8, 1): "ROT?",
    (8, 2): "Unknown",

    # Field 13 subfields (Wind/Navigation) - MAIN SENSOR BLOCK
    (13, 1): "Heading1",
    (13, 2): "Heading2",
    (13, 3): "SmallAngle",
    (13, 4): "TWD",   # True Wind Direction
    (13, 5): "TWS",   # True Wind Speed
    (13, 6): "AWS",   # Apparent Wind Speed?
    (13, 7): "AWD?",  # Apparent Wind Direction?
    (13, 8): "Heading1_dup",
    (13, 9): "Heading2_dup",
    (13, 10): "SmallAngle_dup",
    (13, 11): "TWS_dup",
    (13, 12): "AWS_dup",
    (13, 13): "AWD_dup?",

    # Field 21 subfields
    (21, 1): "Unknown",
    (21, 2): "Angle1",
    (21, 3): "Unknown",
    (21, 4): "Unknown",
}


def get_label(field_path: tuple) -> str:
    """Get label for a field path, or empty string if unknown."""
    return FIELD_LABELS.get(field_path, "")


def format_value(wire_type: int, value: Any) -> str:
    """Format a protobuf value for display."""
    if wire_type == WIRE_VARINT:
        if value > 2**31:
            return f"v:{value} (0x{value:x})"
        return f"v:{value}"

    elif wire_type == WIRE_FIXED64:
        try:
            d = struct.unpack('<d', value)[0]
            if d != d:  # NaN
                return "d:NaN"
            if abs(d) < 0.0001 and d != 0:
                return f"d:{d:.2e}"
            if abs(d) > 10000:
                return f"d:{d:.1f}"
            if -180 <= d <= 180:
                return f"d:{d:.6f}"
            return f"d:{d:.2f}"
        except struct.error:
            return f"x:{value.hex()[:16]}"

    elif wire_type == WIRE_FIXED32:
        try:
            f = struct.unpack('<f', value)[0]
            if f != f:  # NaN
                return "f:NaN"
            # Check if it could be radians (angle)
            if 0 <= f <= 6.5:
                deg = f * 57.2958
                return f"f:{f:.3f} ({deg:.1f}°)"
            # Could be speed in m/s
            if 0 < f < 50:
                kts = f * 1.94384
                return f"f:{f:.2f} ({kts:.1f}kt)"
            return f"f:{f:.3f}"
        except struct.error:
            return f"x:{value.hex()}"

    elif wire_type == WIRE_LENGTH:
        # Try as string
        try:
            s = value.decode('ascii')
            if all(32 <= ord(c) < 127 for c in s):
                if len(s) > 15:
                    return f's:"{s[:12]}..."'
                return f's:"{s}"'
        except UnicodeDecodeError:
            pass
        return f"[{len(value)}B]"

    return "?"


def extract_fields(packet: bytes) -> Optional[Dict[int, Any]]:
    """Extract all fields from a packet."""
    if len(packet) < HEADER_SIZE + 10:
        return None

    proto_data = packet[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    return parser.parse()


def print_snapshot(fields: Dict[int, Any], timestamp: str, packet_size: int):
    """Print a snapshot of all fields in columnar format."""
    # Build column data
    columns = {}  # field_num -> list of (subfield_path, value_str)

    def process_field(field_num: int, wire_type: int, value: Any, children: Optional[Dict], prefix: str = ""):
        col_key = field_num
        if col_key not in columns:
            columns[col_key] = []

        if children:
            # Has subfields
            columns[col_key].append((f"{prefix}", "[msg]"))
            for sub_num, (sub_wt, sub_val, sub_children) in sorted(children.items()):
                label = get_label((field_num, sub_num))
                label_str = f" {label}" if label else ""
                val_str = format_value(sub_wt, sub_val)
                columns[col_key].append((f" .{sub_num}{label_str}", val_str))

                # Go one level deeper for nested messages
                if sub_children:
                    for subsub_num, (subsub_wt, subsub_val, _) in sorted(sub_children.items()):
                        val_str2 = format_value(subsub_wt, subsub_val)
                        columns[col_key].append((f"  .{sub_num}.{subsub_num}", val_str2))
        else:
            val_str = format_value(wire_type, value)
            columns[col_key].append((prefix or "val", val_str))

    # Process all top-level fields
    for field_num, (wire_type, value, children) in sorted(fields.items()):
        process_field(field_num, wire_type, value, children)

    # Print header
    print("\n" + "=" * 100)
    print(f" {timestamp} | Packet: {packet_size} bytes | Fields: {len(fields)}")
    print("=" * 100)

    # Determine column layout
    col_nums = sorted(columns.keys())
    if not col_nums:
        print("  No fields decoded")
        return

    # Calculate column widths
    col_width = 28
    cols_per_row = min(4, len(col_nums))

    # Print columns in groups
    for start_idx in range(0, len(col_nums), cols_per_row):
        group_cols = col_nums[start_idx:start_idx + cols_per_row]

        # Header row with labels
        header = ""
        for col_num in group_cols:
            label = get_label((col_num,))
            if label:
                hdr_text = f"F{col_num} {label}"
            else:
                hdr_text = f"Field {col_num}"
            header += f"| {hdr_text:<{col_width - 1}}"
        print(header + "|")
        print("-" * (len(group_cols) * (col_width + 2) + 1))

        # Find max rows needed
        max_rows = max(len(columns[c]) for c in group_cols)

        # Print rows
        for row_idx in range(max_rows):
            row = ""
            for col_num in group_cols:
                col_data = columns[col_num]
                if row_idx < len(col_data):
                    path, val = col_data[row_idx]
                    cell = f"{path}: {val}"
                    if len(cell) > col_width - 1:
                        cell = cell[:col_width - 4] + "..."
                    row += f"| {cell:<{col_width - 1}}"
                else:
                    row += f"| {'':<{col_width - 1}}"
            print(row + "|")

    print()


def read_pcap(filename: str) -> List[Tuple[float, bytes]]:
    """Read (timestamp, UDP payload) tuples from a pcap file."""
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            # IPv4 Ethernet frame: skip 14B Ethernet + variable IP header + 8B UDP header
            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append((ts_sec + ts_usec / 1e6, pkt_data[payload_start:]))
    return packets


class LiveListener:
    """Listen for live packets."""

    def __init__(self, interface_ip: str):
        self.interface_ip = interface_ip
        self.running = False
        self.packets_by_group = {}  # (group, port) -> latest packet
        self.lock = threading.Lock()

    def _create_socket(self, group: str, port: int):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        if hasattr(socket, 'SO_REUSEPORT'):
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
        sock.bind(('', port))
        mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(self.interface_ip))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(1.0)
        return sock

    def _listen(self, sock, group: str, port: int):
        key = (group, port)
        while self.running:
            try:
                data, _ = sock.recvfrom(65535)
                # Keep packets with protobuf payload (header + minimal data)
                if len(data) >= 40:
                    with self.lock:
                        self.packets_by_group[key] = data
            except socket.timeout:
                continue
            except OSError:
                pass

    def start(self):
        self.running = True
        for group, port in MULTICAST_GROUPS:
            try:
                sock = self._create_socket(group, port)
                t = threading.Thread(target=self._listen, args=(sock, group, port), daemon=True)
                t.start()
                print(f"Listening on {group}:{port}")
            except Exception as e:
                print(f"Error: {e}")

    def get_all_packets(self) -> Dict[tuple, bytes]:
        """Return dict of (group, port) -> packet for all groups with data."""
        with self.lock:
            return dict(self.packets_by_group)

    def stop(self):
        self.running = False


def main():
    parser = argparse.ArgumentParser(description="Field Debugger - Map protobuf fields to real values")
    parser.add_argument('-i', '--interface', help='Interface IP for live capture')
    parser.add_argument('--pcap', help='Read from pcap file')
    parser.add_argument('-n', '--num-snapshots', type=int, default=10, help='Number of snapshots to show')
    parser.add_argument('-t', '--interval', type=float, default=3.0, help='Seconds between snapshots')
    parser.add_argument('-s', '--size', type=int, help='Only show packets of this size')
    args = parser.parse_args()

    if not args.pcap and not args.interface:
        parser.error("Either --interface or --pcap required")

    print("Field Debugger - Protobuf Field Mapper")
    print("=" * 50)
    print("Legend:")
    print("  v:N       = varint (integer)")
    print("  d:N       = double (64-bit float)")
    print("  f:N (X°)  = float as radians -> degrees")
    print("  f:N (Xkt) = float as m/s -> knots")
    print("  s:\"...\"   = string")
    print("  [NB]      = N bytes (nested message)")
    print("=" * 50)

    if args.pcap:
        # Read from pcap
        print(f"\nReading {args.pcap}...")
        packets = read_pcap(args.pcap)
        print(f"Loaded {len(packets)} packets")

        # Filter by size if requested
        if args.size:
            packets = [(ts, p) for ts, p in packets if len(p) == args.size]
            print(f"Filtered to {len(packets)} packets of size {args.size}")

        # Group by size
        by_size = defaultdict(list)
        for ts, pkt in packets:
            by_size[len(pkt)].append((ts, pkt))

        print(f"\nPacket sizes: {sorted(by_size.keys())}")

        # Show snapshots from packets with sensor data
        target_sizes = [s for s in sorted(by_size.keys()) if s >= 300]
        if not target_sizes:
            print("No packets >= 300 bytes found")
            return

        # Pick the largest sensor packets
        target_size = target_sizes[-1] if not args.size else args.size
        target_packets = by_size.get(target_size, [])

        if not target_packets:
            print(f"No packets of size {target_size}")
            return

        # Show snapshots at intervals through the capture
        step = max(1, len(target_packets) // args.num_snapshots)
        for i in range(0, len(target_packets), step):
            if i // step >= args.num_snapshots:
                break
            ts, pkt = target_packets[i]
            fields = extract_fields(pkt)
            if fields:
                timestamp = datetime.fromtimestamp(ts).strftime("%H:%M:%S.%f")[:-3]
                print_snapshot(fields, timestamp, len(pkt))

    else:
        # Live capture
        listener = LiveListener(args.interface)
        listener.start()

        print(f"\nShowing {args.num_snapshots} snapshots, {args.interval}s apart")
        print("Press Ctrl+C to stop\n")

        try:
            for i in range(args.num_snapshots):
                time.sleep(args.interval)
                all_packets = listener.get_all_packets()
                if all_packets:
                    timestamp = datetime.now().strftime("%H:%M:%S.%f")[:-3]
                    for (group, port), pkt in sorted(all_packets.items()):
                        if args.size and len(pkt) != args.size:
                            continue
                        fields = extract_fields(pkt)
                        if fields:
                            header = f"{group}:{port}"
                            print_snapshot(fields, f"{timestamp} [{header}]", len(pkt))
                else:
                    print(f"[{i+1}] No packets received yet...")
        except KeyboardInterrupt:
            print("\nStopped")
        finally:
            listener.stop()


if __name__ == "__main__":
    main()
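ProtobufParser.read_varint above implements the base-128 varint decode (low 7 bits per byte, least-significant group first, high bit set on every byte except the last). A self-contained round-trip sketch of the same encoding; `encode_varint` is a hypothetical helper, not part of the tool:

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative int as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def decode_varint(data: bytes) -> int:
    """Decode a varint from the start of data (mirrors read_varint)."""
    result, shift = 0, 0
    for byte in data:
        result |= (byte & 0x7F) << shift
        if not (byte & 0x80):
            break
        shift += 7
    return result

assert encode_varint(300) == b'\xac\x02'
assert decode_varint(encode_varint(300)) == 300
```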
156
axiom-nmea/debug/field_mapping.py
Normal file
@@ -0,0 +1,156 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Extract and display the field mapping for sensor data.
|
||||
Based on protobuf structure analysis.
|
||||
"""
|
||||
|
||||
import struct
|
||||
|
||||
def read_pcap(filename):
|
||||
packets = []
|
||||
with open(filename, 'rb') as f:
|
||||
header = f.read(24)
|
||||
magic = struct.unpack('<I', header[0:4])[0]
|
||||
swapped = magic == 0xd4c3b2a1
|
||||
endian = '>' if swapped else '<'
|
||||
while True:
|
||||
pkt_header = f.read(16)
|
||||
if len(pkt_header) < 16:
|
||||
break
|
||||
ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
|
||||
pkt_data = f.read(incl_len)
|
||||
if len(pkt_data) < incl_len:
|
||||
break
|
||||
if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
|
||||
ip_header_len = (pkt_data[14] & 0x0F) * 4
|
||||
payload_start = 14 + ip_header_len + 8
|
||||
if payload_start < len(pkt_data):
|
||||
packets.append(pkt_data[payload_start:])
|
||||
return packets
|
||||
|
||||
# Based on protobuf analysis, here's the structure:
|
||||
STRUCTURE = """
|
||||
================================================================================
|
||||
RAYMARINE PACKET STRUCTURE (from protobuf analysis)
|
||||
================================================================================
|
||||
|
||||
FIXED HEADER (20 bytes @ 0x0000-0x0013):
|
||||
0x0000-0x0007: Packet identifier (00 00 00 00 00 00 00 01)
|
||||
0x0008-0x000B: Source ID
|
||||
0x000C-0x000F: Message type indicator
|
||||
0x0010-0x0013: Payload length
|
||||
|
||||
PROTOBUF MESSAGE (starts @ 0x0014):
|
||||
|
||||
Field 1 (length-delim): Device Info
|
||||
└─ Field 1: Device name ("AXIOM 12")
|
||||
└─ Field 2: Serial number info
|
||||
|
||||
Field 2 (length-delim): GPS/Position Data
|
||||
├─ Field 1 (fixed64/double): LATITUDE
|
||||
├─ Field 2 (fixed64/double): LONGITUDE
|
||||
├─ Field 3 (fixed64/double): (unknown, often NaN)
|
||||
├─ Field 4 (fixed64/double): (altitude or similar)
|
||||
├─ Field 5 (fixed64/double): (timestamp or distance)
|
||||
└─ Field 6 (fixed64/double): (timestamp or distance)
|
||||
|
||||
Field 3 (length-delim): Heading Block
|
||||
├─ Field 1 (fixed32/float): Heading value 1 (radians)
|
||||
└─ Field 2 (fixed32/float): HEADING (radians) ← ~31-33°
|
||||
|
||||
Field 6 (length-delim): [only in 446+ byte packets]
|
||||
└─ Field 1 (fixed32/float): (unknown angle)
|
||||
|
||||
Field 8 (length-delim): Rate/Motion Data
|
||||
├─ Field 1 (fixed32/float): Rate of turn?
|
||||
└─ Field 2 (fixed32/float): (often NaN)
|
||||
|
||||
Field 13 (length-delim): WIND/NAVIGATION DATA ← Main sensor block
|
||||
├─ Field 1 (fixed32/float): Heading copy (radians)
|
||||
   ├─ Field 2 (fixed32/float): Heading smoothed (radians)
   ├─ Field 3 (fixed32/float): (small angle, ~0°)
   ├─ Field 4 (fixed32/float): TRUE WIND DIRECTION (radians) ← ~62-70°
   ├─ Field 5 (fixed32/float): WIND SPEED (m/s) ← ~7.26 = 14.1 kts
   ├─ Field 6 (fixed32/float): Wind speed (different value)
   ├─ Field 7 (fixed32/float): (large angle ~245°)
   ├─ Field 8-13: Duplicates/smoothed values of above

Field 14 (length-delim): Additional sensor data (multiple instances)
Field 21 (length-delim): More angles
Field 38, 41: Empty/reserved

================================================================================
KEY SENSOR MAPPINGS
================================================================================

SENSOR              PARENT FIELD   CHILD FIELD   UNIT
------------------  -------------  -----------   ----------------
Latitude            Field 2        Field 1       degrees (double)
Longitude           Field 2        Field 2       degrees (double)
Heading             Field 3        Field 2       radians (float)
True Wind Dir       Field 13       Field 4       radians (float)
Wind Speed          Field 13       Field 5       m/s (float)

================================================================================
"""

print(STRUCTURE)

# Now verify with actual data
print("VERIFICATION WITH ACTUAL DATA:")
print("=" * 70)

packets = read_pcap("raymarine_sample_TWD_62-70_HDG_29-35.pcap")

# Get a 344-byte packet
for pkt in packets:
    if len(pkt) == 344:
        # Skip 20-byte header, protobuf starts at 0x14
        proto = pkt[0x14:]

        print("\n344-byte packet analysis:")
        print("  Expected: TWD 62-70°, Heading 29-35°")

        # Field 3 starts around offset 0x54 in original packet = 0x40 in proto
        # But we need to navigate by protobuf structure

        # Let's use the known byte offsets and verify
        # From earlier analysis:
        #   - 0x0070 had heading ~32.6°
        #   - 0x00a0 had TWD ~61.7°

        def get_float(data, offset):
            if offset + 4 <= len(data):
                return struct.unpack('<f', data[offset:offset+4])[0]
            return None

        # Check field 3 (heading block) - starts around 0x54
        # Field 3, field 2 should be heading
        heading_offset = 0x0070  # From earlier analysis
        heading_val = get_float(pkt, heading_offset)
        if heading_val is not None:
            heading_deg = heading_val * 57.2958
            print(f"  Heading @ 0x{heading_offset:04x}: {heading_val:.4f} rad = {heading_deg:.1f}°")

        # Check field 13, field 4 (TWD)
        twd_offset = 0x00a0  # From earlier analysis
        twd_val = get_float(pkt, twd_offset)
        if twd_val is not None:
            twd_deg = twd_val * 57.2958
            print(f"  TWD @ 0x{twd_offset:04x}: {twd_val:.4f} rad = {twd_deg:.1f}°")

        # Check field 13, field 5 (wind speed)
        ws_offset = 0x00a5  # Should be right after TWD
        ws_val = get_float(pkt, ws_offset)
        if ws_val is not None:
            ws_kts = ws_val * 1.94384
            print(f"  Wind Speed @ 0x{ws_offset:04x}: {ws_val:.2f} m/s = {ws_kts:.1f} kts")

        break

print("\n" + "=" * 70)
print("CONCLUSION: The protobuf structure explains the 'random' offsets:")
print("  - Same field numbers have different byte offsets per packet size")
print("  - This is because preceding fields vary in length")
print("  - Solution: Parse protobuf structure, not fixed byte offsets")
print("=" * 70)
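The conclusion above — walk the protobuf tag/wire-type structure rather than hardcoding byte offsets — can be illustrated with a minimal standalone walker. This is a sketch of the generic protobuf wire format, not the repo's `ProtobufParser`; the toy message and its field numbers are invented for the demo:

```python
# A minimal protobuf wire-format walker (illustrative only; the real decoder
# in this repo is raymarine_nmea.protocol.parser.ProtobufParser).
import struct

def read_varint(data, pos):
    """Decode a protobuf varint at pos; return (value, next_pos)."""
    result = shift = 0
    while True:
        b = data[pos]
        result |= (b & 0x7F) << shift
        pos += 1
        if not (b & 0x80):
            return result, pos
        shift += 7

def walk_fields(data):
    """Yield (field_number, wire_type, raw_value) for one protobuf message."""
    pos = 0
    while pos < len(data):
        tag, pos = read_varint(data, pos)
        field_num, wire = tag >> 3, tag & 0x07
        if wire == 0:                       # varint
            val, pos = read_varint(data, pos)
        elif wire == 1:                     # fixed64, e.g. lat/lon doubles
            val, pos = data[pos:pos + 8], pos + 8
        elif wire == 2:                     # length-delimited (nested message)
            length, pos = read_varint(data, pos)
            val, pos = data[pos:pos + length], pos + length
        elif wire == 5:                     # fixed32, e.g. float radians
            val, pos = data[pos:pos + 4], pos + 4
        else:
            break                           # unknown wire type: stop
        yield field_num, wire, val

# Toy message: field 4 = 1.15 rad (TWD-like), field 5 = 7.26 m/s (speed-like)
msg = (bytes([(4 << 3) | 5]) + struct.pack('<f', 1.15)
       + bytes([(5 << 3) | 5]) + struct.pack('<f', 7.26))
for num, wire, raw in walk_fields(msg):
    print(num, wire, struct.unpack('<f', raw)[0])
```

On a real capture this would be fed `pkt[0x14:]` (the bytes after the 20-byte Raymarine header), recursing into wire-type-2 values to reach nested sensor fields.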
340
axiom-nmea/debug/find_cog_sog.py
Executable file
@@ -0,0 +1,340 @@
#!/usr/bin/env python3
"""
COG/SOG Field Finder

Searches all protobuf fields for values that match expected COG and SOG ranges.
Helps identify which fields contain navigation data.

Usage:
    python find_cog_sog.py -i 198.18.5.5 --cog-min 0 --cog-max 359 --sog-min 0 --sog-max 0.5
    python find_cog_sog.py -i 198.18.5.5 --show-all  # Show ALL numeric fields
"""

import argparse
import os
import signal
import socket
import struct
import sys
import time
from datetime import datetime
from typing import Dict, Any, Optional, List, Tuple

# Add parent directory to path for library import
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from raymarine_nmea.protocol.parser import ProtobufParser, ProtoField
from raymarine_nmea.protocol.constants import (
    WIRE_VARINT, WIRE_FIXED64, WIRE_LENGTH, WIRE_FIXED32,
    HEADER_SIZE, RAD_TO_DEG, MS_TO_KTS,
)
from raymarine_nmea.sensors import MULTICAST_GROUPS

running = True


def signal_handler(signum, frame):
    global running
    running = False


def decode_float(raw: bytes) -> Optional[float]:
    """Decode 4 bytes as float."""
    if len(raw) == 4:
        try:
            val = struct.unpack('<f', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def decode_double(raw: bytes) -> Optional[float]:
    """Decode 8 bytes as double."""
    if len(raw) == 8:
        try:
            val = struct.unpack('<d', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def get_interpretations(val: float) -> Dict[str, float]:
    """Get all possible interpretations of a numeric value."""
    interps = {}

    # Angle interpretations (radians to degrees)
    if 0 <= val <= 6.5:
        interps['deg'] = (val * RAD_TO_DEG) % 360

    # Speed interpretations (m/s to knots)
    if 0 <= val <= 100:
        interps['kts'] = val * MS_TO_KTS

    return interps


def check_cog_match(val: float, cog_min: float, cog_max: float) -> Optional[float]:
    """Check if value could be COG in radians. Returns degrees or None."""
    if 0 <= val <= 6.5:  # Valid radian range
        deg = (val * RAD_TO_DEG) % 360
        # Handle wrap-around
        if cog_min <= cog_max:
            if cog_min <= deg <= cog_max:
                return deg
        else:  # wrap-around case like 350-10
            if deg >= cog_min or deg <= cog_max:
                return deg
    return None


def check_sog_match(val: float, sog_min: float, sog_max: float) -> Optional[float]:
    """Check if value could be SOG in m/s. Returns knots or None."""
    if 0 <= val <= 50:  # Reasonable m/s range
        kts = val * MS_TO_KTS
        if sog_min <= kts <= sog_max:
            return kts
    return None


def scan_all_fields(pf: ProtoField, path: str, results: List[Dict]):
    """Recursively scan ALL fields and collect numeric values."""

    if pf.wire_type == WIRE_FIXED32:
        val = decode_float(pf.value)
        if val is not None:
            interps = get_interpretations(val)
            results.append({
                'path': path,
                'wire': 'f32',
                'raw': val,
                'interps': interps
            })

    elif pf.wire_type == WIRE_FIXED64:
        val = decode_double(pf.value)
        if val is not None:
            interps = get_interpretations(val)
            results.append({
                'path': path,
                'wire': 'f64',
                'raw': val,
                'interps': interps
            })

    elif pf.wire_type == WIRE_VARINT:
        val = float(pf.value)
        interps = get_interpretations(val)
        results.append({
            'path': path,
            'wire': 'var',
            'raw': pf.value,
            'interps': interps
        })

    # Recurse into children
    if pf.children:
        for child_num, child in pf.children.items():
            scan_all_fields(child, f"{path}.{child_num}", results)


def scan_packet(packet: bytes) -> List[Dict]:
    """Scan a packet and return ALL numeric fields."""
    results = []
    if len(packet) < HEADER_SIZE + 10:
        return results

    proto_data = packet[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    fields = parser.parse_message(collect_repeated={14, 16, 20})

    for field_num, val in fields.items():
        if isinstance(val, list):
            for i, pf in enumerate(val):
                scan_all_fields(pf, f"{field_num}[{i}]", results)
        else:
            scan_all_fields(val, f"{field_num}", results)

    return results


def main():
    global running

    parser = argparse.ArgumentParser(description="Find COG/SOG fields in Raymarine packets")
    parser.add_argument('-i', '--interface', required=True,
                        help='Interface IP for Raymarine multicast (e.g., 198.18.5.5)')
    parser.add_argument('--cog-min', type=float, default=0,
                        help='Minimum expected COG in degrees (default: 0)')
    parser.add_argument('--cog-max', type=float, default=359,
                        help='Maximum expected COG in degrees (default: 359)')
    parser.add_argument('--sog-min', type=float, default=0,
                        help='Minimum expected SOG in knots (default: 0)')
    parser.add_argument('--sog-max', type=float, default=2.0,
                        help='Maximum expected SOG in knots (default: 2.0)')
    parser.add_argument('-n', '--count', type=int, default=5,
                        help='Number of packets to analyze (default: 5)')
    parser.add_argument('--interval', type=float, default=1.0,
                        help='Minimum interval between packets (default: 1.0)')

    args = parser.parse_args()

    signal.signal(signal.SIGINT, signal_handler)
    signal.signal(signal.SIGTERM, signal_handler)

    # Create sockets
    sockets = []
    for group, port in MULTICAST_GROUPS:
        try:
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            if hasattr(socket, 'SO_REUSEPORT'):
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
            sock.bind(('', port))
            mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(args.interface))
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            sock.setblocking(False)
            sockets.append((sock, group, port))
        except Exception as e:
            print(f"Error joining {group}:{port}: {e}")

    if not sockets:
        print("Error: Could not join any multicast groups")
        sys.exit(1)

    print("COG/SOG Field Finder")
    print("====================")
    print(f"Looking for COG: {args.cog_min}° - {args.cog_max}°")
    print(f"Looking for SOG: {args.sog_min} - {args.sog_max} kts")
    print(f"Analyzing {args.count} packets...")
    print()

    # Track all field values across packets
    field_values: Dict[str, List[Tuple[float, Dict]]] = {}

    packet_count = 0
    analyzed = 0
    last_time = 0

    try:
        while running and analyzed < args.count:
            for sock, group, port in sockets:
                try:
                    data, addr = sock.recvfrom(65535)
                    packet_count += 1

                    now = time.time()
                    if args.interval > 0 and (now - last_time) < args.interval:
                        continue

                    results = scan_packet(data)
                    if not results:
                        continue

                    analyzed += 1
                    last_time = now

                    # Collect values
                    for r in results:
                        path = r['path']
                        if path not in field_values:
                            field_values[path] = []
                        field_values[path].append((r['raw'], r['interps']))

                except BlockingIOError:
                    continue

            time.sleep(0.01)

    finally:
        for sock, _, _ in sockets:
            sock.close()

    # Analyze and display results
    print(f"\n{'='*80}")
    print(f"ANALYSIS - {analyzed} packets")
    print(f"{'='*80}")

    cog_candidates = []
    sog_candidates = []
    other_fields = []

    for path in sorted(field_values.keys()):
        values = field_values[path]
        raw_vals = [v[0] for v in values]

        if not raw_vals:
            continue

        min_raw = min(raw_vals)
        max_raw = max(raw_vals)
        avg_raw = sum(raw_vals) / len(raw_vals)

        # Check if this could be COG (radians -> degrees in range)
        cog_matches = 0
        cog_degs = []
        for raw, interps in values:
            if 'deg' in interps:
                deg = interps['deg']
                cog_degs.append(deg)
            cog_match = check_cog_match(raw, args.cog_min, args.cog_max)
            if cog_match is not None:
                cog_matches += 1

        # Check if this could be SOG (m/s -> knots in range)
        sog_matches = 0
        sog_kts = []
        for raw, interps in values:
            if 'kts' in interps:
                kts = interps['kts']
                sog_kts.append(kts)
            sog_match = check_sog_match(raw, args.sog_min, args.sog_max)
            if sog_match is not None:
                sog_matches += 1

        # Categorize
        if cog_matches == len(values) and cog_degs:
            cog_candidates.append((path, cog_degs, raw_vals))
        elif sog_matches == len(values) and sog_kts:
            sog_candidates.append((path, sog_kts, raw_vals))
        else:
            other_fields.append((path, raw_vals, cog_degs, sog_kts))

    # Print COG candidates
    print(f"\n*** POTENTIAL COG FIELDS (all {analyzed} samples matched {args.cog_min}°-{args.cog_max}°) ***")
    if cog_candidates:
        for path, degs, raws in cog_candidates:
            min_deg, max_deg = min(degs), max(degs)
            print(f"  {path}: {min_deg:.1f}° - {max_deg:.1f}° (raw: {min(raws):.4f} - {max(raws):.4f} rad)")
    else:
        print("  (none found)")

    # Print SOG candidates
    print(f"\n*** POTENTIAL SOG FIELDS (all {analyzed} samples matched {args.sog_min}-{args.sog_max} kts) ***")
    if sog_candidates:
        for path, kts_list, raws in sog_candidates:
            min_kts, max_kts = min(kts_list), max(kts_list)
            print(f"  {path}: {min_kts:.2f} - {max_kts:.2f} kts (raw: {min(raws):.4f} - {max(raws):.4f} m/s)")
    else:
        print("  (none found)")

    # Print other navigation-looking fields (small positive values)
    print(f"\n*** OTHER NUMERIC FIELDS (may be COG/SOG with different interpretation) ***")
    nav_fields = [(p, r, c, s) for p, r, c, s in other_fields
                  if len(r) > 0 and 0 < min(r) < 100 and max(r) < 1000]

    for path, raws, cog_degs, sog_kts in sorted(nav_fields, key=lambda x: x[0]):
        min_raw, max_raw = min(raws), max(raws)
        info = f"  {path}: raw {min_raw:.4f} - {max_raw:.4f}"
        if cog_degs:
            info += f" | as deg: {min(cog_degs):.1f}° - {max(cog_degs):.1f}°"
        if sog_kts:
            info += f" | as kts: {min(sog_kts):.2f} - {max(sog_kts):.2f}"
        print(info)


if __name__ == "__main__":
    main()
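The wrap-around branch of `check_cog_match` above is the easy part to get wrong when the expected COG window straddles north (e.g. 350°-10°). A standalone sketch of the same logic, with `RAD_TO_DEG` inlined and made-up test headings:

```python
# Standalone copy of the wrap-around range test used by check_cog_match.
# RAD_TO_DEG is inlined; the sample headings below are invented for the demo.
RAD_TO_DEG = 57.2958

def cog_in_range(rad, cog_min, cog_max):
    """Return COG in degrees if the radian value falls in the expected window."""
    if not (0 <= rad <= 6.5):          # not a plausible radian angle
        return None
    deg = (rad * RAD_TO_DEG) % 360
    if cog_min <= cog_max:             # normal window, e.g. 60..80
        return deg if cog_min <= deg <= cog_max else None
    # wrapped window, e.g. 350..10, straddling north: OR instead of AND
    return deg if (deg >= cog_min or deg <= cog_max) else None

print(cog_in_range(0.05, 350, 10))     # ~2.9°: inside the wrapped window
print(cog_in_range(3.14, 350, 10))     # ~180°: outside, returns None
```

The OR test is what makes the wrapped window work: a heading just past north satisfies `deg <= cog_max` even though it fails `deg >= cog_min`.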
86
axiom-nmea/debug/find_consistent_heading.py
Normal file
@@ -0,0 +1,86 @@
#!/usr/bin/env python3
"""Find offsets that show consistent heading-like values in BOTH pcaps."""

import struct
from collections import defaultdict


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except struct.error:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


# Load both pcaps
pcap1 = read_pcap("raymarine_sample.pcap")
pcap2 = read_pcap("raymarine_sample_twd_69-73.pcap")

print("Looking for offsets where values differ significantly between captures")
print("(indicating actual sensor data that changed, like heading)\n")

# Focus on 344 and 446 byte packets (most common with wind data)
for target_size in [344, 446]:
    print(f"\n{'='*70}")
    print(f"PACKET SIZE: {target_size} bytes")
    print("=" * 70)

    pkts1 = [p for p in pcap1 if len(p) == target_size][:10]
    pkts2 = [p for p in pcap2 if len(p) == target_size][:10]

    if not pkts1 or not pkts2:
        print("  Not enough packets")
        continue

    print(f"\n{'Offset':<10} {'Original pcap':<20} {'TWD pcap':<20} {'Diff':<10}")
    print("-" * 60)

    # Check offsets from 0x50 to 0x180
    for offset in range(0x50, min(target_size - 4, 0x180)):
        vals1 = [decode_float(p, offset) for p in pkts1]
        vals2 = [decode_float(p, offset) for p in pkts2]

        # Filter valid radian values (0 to 2*pi)
        degs1 = [v * 57.2958 for v in vals1 if v and 0 < v < 6.5]
        degs2 = [v * 57.2958 for v in vals2 if v and 0 < v < 6.5]

        if not degs1 or not degs2:
            continue

        avg1 = sum(degs1) / len(degs1)
        avg2 = sum(degs2) / len(degs2)
        diff = abs(avg2 - avg1)

        # Look for offsets where:
        # 1. Values are valid angles (not 0, not garbage)
        # 2. Values changed between captures (diff > 5°)
        # 3. Both values are in reasonable heading range (0-360°)
        if 5 < avg1 < 355 and 5 < avg2 < 355 and diff > 5:
            print(f"0x{offset:04x}     {avg1:6.1f}° ({len(degs1)} hits)     {avg2:6.1f}° ({len(degs2)} hits)    {diff:5.1f}°")
127
axiom-nmea/debug/find_heading_vs_twd.py
Normal file
@@ -0,0 +1,127 @@
#!/usr/bin/env python3
"""
Find heading vs TWD candidates.
At anchor pointing into wind: heading and TWD should be within ~50° of each other.
TWD expected: 69-73°, so heading could be ~20-120°
"""

import struct
from collections import defaultdict


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except struct.error:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


# Search for angles in a wider range around expected TWD (69-73°)
# Heading at anchor into wind could be 20-120°
TARGET_DEG_MIN = 15
TARGET_DEG_MAX = 130
TARGET_RAD_MIN = TARGET_DEG_MIN * 0.0174533
TARGET_RAD_MAX = TARGET_DEG_MAX * 0.0174533

print("Reading raymarine_sample_twd_69-73.pcap...")
packets = read_pcap("raymarine_sample_twd_69-73.pcap")
print(f"Loaded {len(packets)} packets")
print(f"\nSearching for angles {TARGET_DEG_MIN}-{TARGET_DEG_MAX}° (could be heading or TWD)")
print("Expected: TWD ~69-73°, Heading within ±50° of that\n")

# Track candidates by offset
candidates = defaultdict(list)

for pkt_idx, pkt in enumerate(packets):
    pkt_len = len(pkt)
    if pkt_len < 100:
        continue

    for offset in range(0x50, min(pkt_len - 4, 0x200)):
        val = decode_float(pkt, offset)
        if val is None:
            continue

        # Check if in target radian range
        if TARGET_RAD_MIN <= val <= TARGET_RAD_MAX:
            deg = val * 57.2958
            candidates[offset].append((pkt_idx, pkt_len, val, deg))

print("=" * 70)
print("ANGLE CANDIDATES (heading or TWD)")
print("Filtering for offsets with at least 10 hits in target packet sizes")
print("=" * 70)

# Focus on packet sizes known to have wind data
target_sizes = {344, 446, 788, 888, 931, 1031, 1472}

for offset in sorted(candidates.keys()):
    hits = candidates[offset]
    # Filter to target packet sizes
    target_hits = [(i, s, v, d) for i, s, v, d in hits if s in target_sizes]

    if len(target_hits) < 10:
        continue

    degs = [d for _, _, _, d in target_hits]
    pkt_sizes = sorted(set(s for _, s, _, _ in target_hits))

    avg_deg = sum(degs) / len(degs)
    min_deg = min(degs)
    max_deg = max(degs)

    # Categorize based on value
    if 65 <= avg_deg <= 80:
        category = "*** LIKELY TWD ***"
    elif 20 <= avg_deg <= 50 or 90 <= avg_deg <= 120:
        category = "    (could be heading)"
    else:
        category = ""

    print(f"\n  0x{offset:04x}: {len(target_hits):3d} hits, avg={avg_deg:5.1f}°, range={min_deg:.1f}°-{max_deg:.1f}° {category}")
    print(f"    Sizes: {pkt_sizes}")

# Now compare offset 0x006b with nearby offsets
print("\n" + "=" * 70)
print("DETAILED COMPARISON: 0x006b vs nearby offsets")
print("=" * 70)

check_offsets = [0x0066, 0x006b, 0x0070, 0x0075, 0x007a]

for pkt_len in [344, 446, 788, 888]:
    matching = [p for p in packets if len(p) == pkt_len][:3]
    if not matching:
        continue

    print(f"\n  {pkt_len} byte packets (first 3):")
    for off in check_offsets:
        vals = [decode_float(p, off) for p in matching]
        degs = [f"{v * 57.2958:.1f}°" if v and 0 <= v <= 6.5 else "---" for v in vals]
        print(f"    0x{off:04x}: {degs}")
93
axiom-nmea/debug/find_twd.py
Normal file
@@ -0,0 +1,93 @@
#!/usr/bin/env python3
"""Search for True Wind Direction values in the 69-73 degree range."""

import struct
from collections import defaultdict

# 69-73 degrees in radians
TARGET_DEG_MIN = 66  # slightly wider range
TARGET_DEG_MAX = 76
TARGET_RAD_MIN = TARGET_DEG_MIN * 0.0174533  # ~1.15 rad
TARGET_RAD_MAX = TARGET_DEG_MAX * 0.0174533  # ~1.33 rad


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except struct.error:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


print("Reading raymarine_sample_twd_69-73.pcap...")
packets = read_pcap("raymarine_sample_twd_69-73.pcap")
print(f"Loaded {len(packets)} packets\n")

print(f"Searching for direction values {TARGET_DEG_MIN}-{TARGET_DEG_MAX}° ({TARGET_RAD_MIN:.3f}-{TARGET_RAD_MAX:.3f} rad)\n")

# Track candidates by offset
candidates = defaultdict(list)

for pkt_idx, pkt in enumerate(packets):
    pkt_len = len(pkt)
    if pkt_len < 100:
        continue

    for offset in range(0x30, min(pkt_len - 4, 0x400)):
        val = decode_float(pkt, offset)
        if val is None:
            continue

        # Check if in target radian range
        if TARGET_RAD_MIN <= val <= TARGET_RAD_MAX:
            deg = val * 57.2958
            candidates[offset].append((pkt_idx, pkt_len, val, deg))

print("=" * 70)
print("WIND DIRECTION CANDIDATES (by offset, sorted by hit count)")
print("=" * 70)

# Sort by number of hits
sorted_offsets = sorted(candidates.keys(), key=lambda x: -len(candidates[x]))

for offset in sorted_offsets[:25]:
    hits = candidates[offset]
    values = [v for _, _, v, _ in hits]
    degs = [d for _, _, _, d in hits]
    pkt_sizes = sorted(set(s for _, s, _, _ in hits))

    avg_rad = sum(values) / len(values)
    avg_deg = sum(degs) / len(degs)
    min_deg = min(degs)
    max_deg = max(degs)

    print(f"\n  Offset 0x{offset:04x}: {len(hits):4d} hits")
    print(f"    Degrees: avg={avg_deg:.1f}°, range={min_deg:.1f}°-{max_deg:.1f}°")
    print(f"    Radians: avg={avg_rad:.4f}")
    print(f"    Packet sizes: {pkt_sizes[:8]}{'...' if len(pkt_sizes) > 8 else ''}")
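The brute-force float scan these find_* scripts share can be sanity-checked on synthetic data: plant a known radian value in a zeroed buffer and confirm the scan reports exactly that offset. The buffer size, the 0x40 offset, and the 68-74° window below are invented for the demo:

```python
# Self-test of the brute-force scan: a 71° (~1.239 rad) float planted at 0x40
# in an otherwise-zero buffer should be the only hit in a 68-74° window.
import struct
from collections import defaultdict

RAD_MIN, RAD_MAX = 68 * 0.0174533, 74 * 0.0174533   # hypothetical target window

payload = bytearray(128)
payload[0x40:0x44] = struct.pack('<f', 71 * 0.0174533)

hits = defaultdict(list)
for offset in range(0, len(payload) - 4):
    val = struct.unpack('<f', payload[offset:offset + 4])[0]
    if RAD_MIN <= val <= RAD_MAX:
        hits[offset].append(val * 57.2958)

print({hex(k): v for k, v in hits.items()})
```

Misaligned reads over the planted bytes decode to huge, negative, or denormal floats, so the radian-range filter rejects them; only the true offset survives, which is the same property the scripts rely on when scanning real payloads.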
112
axiom-nmea/debug/find_twd_hdg.py
Normal file
@@ -0,0 +1,112 @@
#!/usr/bin/env python3
"""
Find TWD and Heading offsets using known values:
- TWD: 62-70°
- Heading: 29-35°
"""

import struct
from collections import defaultdict

# Known ranges
TWD_DEG_MIN, TWD_DEG_MAX = 60, 72  # Slightly wider
HDG_DEG_MIN, HDG_DEG_MAX = 27, 37  # Slightly wider

TWD_RAD_MIN = TWD_DEG_MIN * 0.0174533
TWD_RAD_MAX = TWD_DEG_MAX * 0.0174533
HDG_RAD_MIN = HDG_DEG_MIN * 0.0174533
HDG_RAD_MAX = HDG_DEG_MAX * 0.0174533


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        val = struct.unpack('<f', data[offset:offset+4])[0]
        if val != val:  # NaN check
            return None
        return val
    except struct.error:
        return None


def read_pcap(filename):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


print("Reading raymarine_sample_TWD_62-70_HDG_29-35.pcap...")
packets = read_pcap("raymarine_sample_TWD_62-70_HDG_29-35.pcap")
print(f"Loaded {len(packets)} packets\n")

print("Searching for:")
print(f"  TWD: {TWD_DEG_MIN}-{TWD_DEG_MAX}° ({TWD_RAD_MIN:.3f}-{TWD_RAD_MAX:.3f} rad)")
print(f"  HDG: {HDG_DEG_MIN}-{HDG_DEG_MAX}° ({HDG_RAD_MIN:.3f}-{HDG_RAD_MAX:.3f} rad)\n")

# Track candidates
twd_candidates = defaultdict(list)
hdg_candidates = defaultdict(list)

target_sizes = {344, 446, 788, 888, 931, 1031, 1472}

for pkt_idx, pkt in enumerate(packets):
    pkt_len = len(pkt)
    if pkt_len not in target_sizes:
        continue

    for offset in range(0x50, min(pkt_len - 4, 0x200)):
        val = decode_float(pkt, offset)
        if val is None:
            continue

        deg = val * 57.2958

        if TWD_RAD_MIN <= val <= TWD_RAD_MAX:
            twd_candidates[offset].append((pkt_len, deg))

        if HDG_RAD_MIN <= val <= HDG_RAD_MAX:
            hdg_candidates[offset].append((pkt_len, deg))

print("=" * 70)
print(f"TWD CANDIDATES ({TWD_DEG_MIN}-{TWD_DEG_MAX}°)")
print("=" * 70)

for offset in sorted(twd_candidates.keys(), key=lambda x: -len(twd_candidates[x]))[:15]:
    hits = twd_candidates[offset]
    degs = [d for _, d in hits]
    sizes = sorted(set(s for s, _ in hits))
    avg = sum(degs) / len(degs)
    print(f"  0x{offset:04x}: {len(hits):3d} hits, avg={avg:.1f}°, range={min(degs):.1f}-{max(degs):.1f}°, sizes={sizes}")

print("\n" + "=" * 70)
print(f"HEADING CANDIDATES ({HDG_DEG_MIN}-{HDG_DEG_MAX}°)")
print("=" * 70)

for offset in sorted(hdg_candidates.keys(), key=lambda x: -len(hdg_candidates[x]))[:15]:
    hits = hdg_candidates[offset]
    degs = [d for _, d in hits]
    sizes = sorted(set(s for s, _ in hits))
    avg = sum(degs) / len(degs)
    print(f"  0x{offset:04x}: {len(hits):3d} hits, avg={avg:.1f}°, range={min(degs):.1f}-{max(degs):.1f}°, sizes={sizes}")

# Cross-check: find offsets that appear in both (shouldn't happen if ranges don't overlap)
common = set(twd_candidates.keys()) & set(hdg_candidates.keys())
if common:
    print(f"\n⚠️  Offsets appearing in both ranges: {[hex(o) for o in common]}")
102
axiom-nmea/debug/find_twd_precise.py
Normal file
@@ -0,0 +1,102 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Find all float values in 69-73 degree range more precisely."""
|
||||
|
||||
import struct
|
||||
from collections import defaultdict
|
||||
|
||||
# Exact expected range: 69-73 degrees
|
||||
TARGET_DEG_MIN = 68
|
||||
TARGET_DEG_MAX = 74
|
||||
TARGET_RAD_MIN = TARGET_DEG_MIN * 0.0174533
|
||||
TARGET_RAD_MAX = TARGET_DEG_MAX * 0.0174533
|
||||
|
||||
def decode_float(data, offset):
|
||||
if offset + 4 > len(data):
|
||||
return None
|
||||
try:
|
||||
val = struct.unpack('<f', data[offset:offset+4])[0]
|
||||
if val != val: # NaN check
|
||||
return None
|
||||
return val
|
||||
except:
|
||||
return None
|
||||
|
||||
def read_pcap(filename):
|
||||
packets = []
|
||||
with open(filename, 'rb') as f:
|
||||
header = f.read(24)
|
||||
magic = struct.unpack('<I', header[0:4])[0]
|
||||
swapped = magic == 0xd4c3b2a1
|
||||
endian = '>' if swapped else '<'
|
||||
|
||||
while True:
|
||||
pkt_header = f.read(16)
|
||||
if len(pkt_header) < 16:
|
||||
break
|
||||
ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
|
||||
pkt_data = f.read(incl_len)
|
||||
if len(pkt_data) < incl_len:
|
||||
break
|
||||
|
||||
if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
|
||||
ip_header_len = (pkt_data[14] & 0x0F) * 4
|
||||
payload_start = 14 + ip_header_len + 8
|
||||
if payload_start < len(pkt_data):
|
||||
packets.append(pkt_data[payload_start:])
|
||||
return packets
|
||||
|
||||
print("Reading raymarine_sample_twd_69-73.pcap...")
packets = read_pcap("raymarine_sample_twd_69-73.pcap")
print(f"Loaded {len(packets)} packets\n")

print(f"Searching PRECISELY for {TARGET_DEG_MIN}-{TARGET_DEG_MAX}° ({TARGET_RAD_MIN:.4f}-{TARGET_RAD_MAX:.4f} rad)\n")

# Track candidates by offset
candidates = defaultdict(list)

for pkt_idx, pkt in enumerate(packets):
    pkt_len = len(pkt)
    if pkt_len < 100:
        continue

    for offset in range(0x30, min(pkt_len - 4, 0x500)):
        val = decode_float(pkt, offset)
        if val is None:
            continue

        # Check if in target radian range
        if TARGET_RAD_MIN <= val <= TARGET_RAD_MAX:
            deg = val * 57.2958
            candidates[offset].append((pkt_idx, pkt_len, val, deg))

print("=" * 70)
print(f"OFFSETS WITH VALUES IN {TARGET_DEG_MIN}-{TARGET_DEG_MAX}° RANGE")
print("=" * 70)

# Sort by number of hits
sorted_offsets = sorted(candidates.keys(), key=lambda x: -len(candidates[x]))

for offset in sorted_offsets[:30]:
    hits = candidates[offset]
    degs = [d for _, _, _, d in hits]
    pkt_sizes = sorted(set(s for _, s, _, _ in hits))

    avg_deg = sum(degs) / len(degs)
    min_deg = min(degs)
    max_deg = max(degs)

    print(f"\n  0x{offset:04x}: {len(hits):3d} hits, avg={avg_deg:.1f}°, range={min_deg:.1f}°-{max_deg:.1f}°")
    print(f"    Packet sizes: {pkt_sizes}")

# Also show what's at offset 0x0070 since it had good results
print("\n" + "=" * 70)
print("DETAILED CHECK OF OFFSET 0x0070 (from earlier analysis)")
print("=" * 70)

for pkt_len in [344, 446, 788, 888, 931, 1031, 1472]:
    matching = [p for p in packets if len(p) == pkt_len][:5]
    if matching:
        vals = [decode_float(p, 0x0070) for p in matching]
        degs = [v * 57.2958 if v is not None else None for v in vals]
        print(f"  {pkt_len} bytes: {[f'{d:.1f}°' if d is not None else 'N/A' for d in degs]}")
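The scan loop above reads a candidate little-endian float32 at each byte offset and converts radians to degrees. A minimal standalone sketch of that step, with a hypothetical re-implementation of `decode_float` and fabricated packet bytes (a 1.2217 rad value, roughly 70°, planted at offset 0x08):

```python
import struct

RAD_TO_DEG = 57.2957795131


def decode_float(pkt: bytes, offset: int):
    """Read a little-endian float32 at `offset`; None if out of range or NaN."""
    if offset + 4 > len(pkt):
        return None
    val = struct.unpack_from('<f', pkt, offset)[0]
    if val != val:  # NaN compares unequal to itself
        return None
    return val


# Fabricated packet: zero padding with one planted angle value
pkt = b'\x00' * 8 + struct.pack('<f', 1.2217) + b'\x00' * 8
val = decode_float(pkt, 0x08)
print(round(val * RAD_TO_DEG, 1))
```

Values outside 0..2π (and NaNs) are what the radian-range filter above discards; the same unpack-at-offset idea works for any fixed-width field candidate.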
289
axiom-nmea/debug/packet_debug.py
Executable file
@@ -0,0 +1,289 @@
#!/usr/bin/env python3
"""
Raymarine Packet Debug Tool

Dumps raw protobuf field structure from Raymarine multicast packets.
Use this to discover field locations for COG, SOG, and other data.

Usage:
    python packet_debug.py -i 198.18.5.5

The tool shows:
- All top-level protobuf fields and their wire types
- Nested field structures with decoded values
- Float/double interpretations for potential navigation data
"""

import argparse
import os
import signal
import socket
import struct
import sys
import time
from datetime import datetime
from typing import Optional, List

# Add parent directory to path for library import
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from raymarine_nmea.protocol.parser import ProtobufParser, ProtoField
from raymarine_nmea.protocol.constants import (
    WIRE_VARINT, WIRE_FIXED64, WIRE_LENGTH, WIRE_FIXED32,
    HEADER_SIZE, RAD_TO_DEG, MS_TO_KTS,
)
from raymarine_nmea.sensors import MULTICAST_GROUPS

# Known field names for reference
FIELD_NAMES = {
    1: "DEVICE_INFO",
    2: "GPS_POSITION",
    3: "HEADING",
    7: "DEPTH",
    13: "WIND_NAVIGATION",
    14: "ENGINE_DATA",
    15: "TEMPERATURE",
    16: "TANK_DATA",
    20: "HOUSE_BATTERY",
}

WIRE_TYPE_NAMES = {
    0: "varint",
    1: "fixed64",
    2: "length",
    5: "fixed32",
}

running = True


def signal_handler(signum, frame):
    global running
    running = False


def decode_as_float(raw: bytes) -> Optional[float]:
    """Try to decode 4 bytes as a little-endian float."""
    if len(raw) == 4:
        try:
            val = struct.unpack('<f', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def decode_as_double(raw: bytes) -> Optional[float]:
    """Try to decode 8 bytes as a little-endian double."""
    if len(raw) == 8:
        try:
            val = struct.unpack('<d', raw)[0]
            if val == val:  # NaN check
                return val
        except struct.error:
            pass
    return None


def format_value(pf: ProtoField, indent: int = 0) -> List[str]:
    """Format a protobuf field for display."""
    lines = []
    prefix = "  " * indent
    wire_name = WIRE_TYPE_NAMES.get(pf.wire_type, f"wire{pf.wire_type}")
    field_name = FIELD_NAMES.get(pf.field_num, "")
    if field_name:
        field_name = f" ({field_name})"

    if pf.wire_type == WIRE_VARINT:
        lines.append(f"{prefix}Field {pf.field_num}{field_name}: {pf.value} [{wire_name}]")

    elif pf.wire_type == WIRE_FIXED32:
        fval = decode_as_float(pf.value)
        hex_str = pf.value.hex()
        if fval is not None:
            # Show various interpretations
            deg = fval * RAD_TO_DEG if 0 <= fval <= 6.5 else None
            kts = fval * MS_TO_KTS if 0 <= fval <= 50 else None
            interp = []
            if deg is not None and 0 <= deg <= 360:
                interp.append(f"{deg:.1f}°")
            if kts is not None and 0 <= kts <= 100:
                interp.append(f"{kts:.1f}kts")
            interp_str = f" -> {', '.join(interp)}" if interp else ""
            lines.append(f"{prefix}Field {pf.field_num}{field_name}: {fval:.6f} [{wire_name}] 0x{hex_str}{interp_str}")
        else:
            lines.append(f"{prefix}Field {pf.field_num}{field_name}: 0x{hex_str} [{wire_name}]")

    elif pf.wire_type == WIRE_FIXED64:
        dval = decode_as_double(pf.value)
        hex_str = pf.value.hex()
        if dval is not None:
            lines.append(f"{prefix}Field {pf.field_num}{field_name}: {dval:.8f} [{wire_name}]")
        else:
            lines.append(f"{prefix}Field {pf.field_num}{field_name}: 0x{hex_str} [{wire_name}]")

    elif pf.wire_type == WIRE_LENGTH:
        data_len = len(pf.value) if isinstance(pf.value, bytes) else 0
        lines.append(f"{prefix}Field {pf.field_num}{field_name}: [{wire_name}, {data_len} bytes]")
        if pf.children:
            for child_num in sorted(pf.children.keys()):
                child = pf.children[child_num]
                lines.extend(format_value(child, indent + 1))
        elif data_len <= 32:
            lines.append(f"{prefix}  Raw: 0x{pf.value.hex()}")

    return lines


def dump_packet(packet: bytes, packet_num: int, filter_fields: Optional[set] = None):
    """Dump a single packet's protobuf structure."""
    if len(packet) < HEADER_SIZE + 10:
        return

    proto_data = packet[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)

    # Collect repeated fields that we know about
    fields = parser.parse_message(collect_repeated={14, 16, 20})

    if not fields:
        return

    # Filter if requested
    if filter_fields:
        fields = {k: v for k, v in fields.items() if k in filter_fields}
        if not fields:
            return

    timestamp = datetime.now().strftime('%H:%M:%S.%f')[:-3]
    print(f"\n{'='*70}")
    print(f"Packet #{packet_num} at {timestamp} ({len(packet)} bytes)")
    print(f"{'='*70}")

    for field_num in sorted(fields.keys()):
        val = fields[field_num]
        # Handle repeated fields (stored as list)
        if isinstance(val, list):
            for i, pf in enumerate(val):
                lines = format_value(pf)
                for line_idx, line in enumerate(lines):
                    # Add index to first line for repeated fields
                    if line_idx == 0:
                        print(f"{line} [{i}]")
                    else:
                        print(line)
        else:
            for line in format_value(val):
                print(line)


def main():
    global running

    parser = argparse.ArgumentParser(description="Debug Raymarine packet structure")
    parser.add_argument('-i', '--interface', required=True,
                        help='Interface IP for Raymarine multicast (e.g., 198.18.5.5)')
    parser.add_argument('-n', '--count', type=int, default=0,
                        help='Number of packets to capture (0 = unlimited)')
    parser.add_argument('-f', '--fields', type=str, default='',
                        help='Comma-separated field numbers to show (empty = all)')
    parser.add_argument('--interval', type=float, default=0.0,
                        help='Minimum interval between packets shown (seconds)')
    parser.add_argument('--nav-only', action='store_true',
                        help='Show only navigation-related fields (2,3,13)')

    args = parser.parse_args()

    # Parse field filter
    filter_fields = None
    if args.nav_only:
        filter_fields = {2, 3, 13}  # GPS, Heading, Wind/Nav
    elif args.fields:
        try:
            filter_fields = set(int(x.strip()) for x in args.fields.split(','))
        except ValueError:
            print("Error: --fields must be comma-separated integers")
            sys.exit(1)

    signal.signal(signal.SIGINT, signal_handler)
    signal.signal(signal.SIGTERM, signal_handler)

    # Create sockets for all multicast groups
    sockets = []
    for group, port in MULTICAST_GROUPS:
        try:
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            if hasattr(socket, 'SO_REUSEPORT'):
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
            sock.bind(('', port))

            # Join multicast group using struct.pack like MulticastListener does
            mreq = struct.pack("4s4s",
                               socket.inet_aton(group),
                               socket.inet_aton(args.interface))
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
            sock.setblocking(False)
            sockets.append((sock, group, port))
            print(f"Joined {group}:{port}")
        except Exception as e:
            print(f"Error joining {group}:{port}: {e}")

    if not sockets:
        print("Error: Could not join any multicast groups")
        sys.exit(1)

    print("\nRaymarine Packet Debug Tool")
    print(f"Listening on {args.interface}")
    if filter_fields:
        print(f"Showing fields: {sorted(filter_fields)}")
    print("Press Ctrl+C to stop\n")

    print("Known field numbers:")
    for num, name in sorted(FIELD_NAMES.items()):
        print(f"  {num}: {name}")
    print()

    packet_count = 0
    last_dump = 0

    try:
        while running:
            # Poll all sockets
            for sock, group, port in sockets:
                try:
                    data, addr = sock.recvfrom(65535)
                    packet_count += 1

                    # Rate limiting
                    now = time.time()
                    if args.interval > 0 and (now - last_dump) < args.interval:
                        continue

                    dump_packet(data, packet_count, filter_fields)
                    last_dump = now

                    # Count limit
                    if args.count > 0 and packet_count >= args.count:
                        running = False
                        break

                except BlockingIOError:
                    continue
                except Exception as e:
                    if running:
                        print(f"Error on {group}:{port}: {e}")

            time.sleep(0.01)  # Small sleep to avoid busy-waiting

    finally:
        for sock, _, _ in sockets:
            sock.close()
        print(f"\n\nCaptured {packet_count} packets")


if __name__ == "__main__":
    main()
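The socket setup in `main()` joins each Raymarine group by packing an `ip_mreq` structure (4-byte group address + 4-byte interface address) and passing it to `IP_ADD_MEMBERSHIP`. A minimal sketch of just that membership-struct step; the addresses are placeholders, and the actual `setsockopt` call is left commented out since it needs a live interface:

```python
import socket
import struct


def make_membership(group: str, iface: str) -> bytes:
    """Build the 8-byte ip_mreq struct: multicast group address + local interface address."""
    return struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))


# "0.0.0.0" lets the kernel pick the interface; a Cerbo would pass its own IP instead
mreq = make_membership("226.192.206.102", "0.0.0.0")
# sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
print(len(mreq), mreq.hex())
```

The same 8-byte layout is what the scripts here pack inline; building it in one helper keeps the group/interface order explicit.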
346
axiom-nmea/debug/pressure_finder.py
Executable file
@@ -0,0 +1,346 @@
#!/usr/bin/env python3
"""
Pressure Finder - Locate barometric pressure data in the Raymarine protobuf stream.

Scans for float values that could be pressure in various units.
Uses a known pressure value to correlate field locations.

Usage:
    python pressure_finder.py -i YOUR_INTERFACE_IP -p 1021  # Known pressure in mbar
"""

import struct
import socket
import time
import argparse
from typing import Dict, List, Any, Optional

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.102", 2565),  # Main sensor data
    ("239.2.1.1", 2154),        # May contain additional data
]


class ProtobufParser:
    """Parse protobuf without a schema."""

    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def remaining(self):
        return len(self.data) - self.pos

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def parse_nested_deep(self, data: bytes, path: str = "", depth: int = 0, max_depth: int = 5) -> List[tuple]:
        """Recursively parse nested protobuf and return a list of (path, type, value) tuples."""
        results = []
        pos = 0

        if depth > max_depth:
            return results

        while pos < len(data):
            try:
                # Read tag (handle multi-byte varints)
                tag_byte = data[pos]
                pos += 1
                tag = tag_byte & 0x7F
                shift = 7
                while tag_byte & 0x80 and pos < len(data):
                    tag_byte = data[pos]
                    pos += 1
                    tag |= (tag_byte & 0x7F) << shift
                    shift += 7

                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 100:
                    break

                field_path = f"{path}.{field_num}" if path else str(field_num)

                if wire_type == WIRE_VARINT:
                    val = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        val |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    results.append((field_path, 'varint', val))

                elif wire_type == WIRE_FIXED32:
                    raw = data[pos:pos + 4]
                    pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        if f == f:  # not NaN
                            results.append((field_path, 'float', f))
                    except struct.error:
                        pass

                elif wire_type == WIRE_FIXED64:
                    raw = data[pos:pos + 8]
                    pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        if d == d:  # not NaN
                            results.append((field_path, 'double', d))
                    except struct.error:
                        pass

                elif wire_type == WIRE_LENGTH:
                    length = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        length |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    raw = data[pos:pos + length]
                    pos += length

                    # Try to parse as a nested message
                    if len(raw) >= 2:
                        nested_results = self.parse_nested_deep(raw, field_path, depth + 1, max_depth)
                        if nested_results:
                            results.extend(nested_results)

                else:
                    # Unknown wire type: stop scanning this buffer
                    break

            except Exception:
                break

        return results


def is_pressure_like(val: float, target_mbar: float) -> Optional[str]:
    """Check if a float value could be barometric pressure.

    Checks multiple unit possibilities and returns a match description.
    """
    tolerance = 0.02  # 2% tolerance

    # Convert target to various units
    target_pa = target_mbar * 100        # Pascals (1021 mbar = 102100 Pa)
    target_hpa = target_mbar             # hPa = mbar
    target_kpa = target_mbar / 10        # kPa (102.1)
    target_bar = target_mbar / 1000      # bar (1.021)
    target_inhg = target_mbar * 0.02953  # inHg (~30.15)
    target_psi = target_mbar * 0.0145    # PSI (~14.8)

    checks = [
        (target_mbar, "mbar (direct)"),
        (target_hpa, "hPa"),
        (target_pa, "Pascals"),
        (target_kpa, "kPa"),
        (target_bar, "bar"),
        (target_inhg, "inHg"),
        (target_psi, "PSI"),
    ]

    for target, unit in checks:
        if target > 0 and abs(val - target) / target < tolerance:
            return unit

    return None


def scan_packet(data: bytes, target_mbar: float, group: str, port: int,
                candidates: Dict[str, Dict]) -> None:
    """Scan a packet for pressure-like values."""
    if len(data) < HEADER_SIZE + 5:
        return

    proto_data = data[HEADER_SIZE:]

    # Deep parse all length-delimited fields
    parser = ProtobufParser(proto_data)
    all_results = parser.parse_nested_deep(proto_data, "")

    for path, vtype, value in all_results:
        if vtype in ('float', 'double'):
            unit = is_pressure_like(value, target_mbar)
            if unit:
                if path not in candidates:
                    candidates[path] = {
                        'values': [],
                        'unit': unit,
                        'type': vtype,
                        'count': 0
                    }
                candidates[path]['values'].append(value)
                candidates[path]['count'] += 1


def parse_all_fields(self) -> Dict[int, List[Any]]:
    """Parse and collect all top-level fields."""
    fields = {}

    while self.pos < len(self.data):
        if self.remaining() < 1:
            break
        try:
            tag = self.read_varint()
            field_num = tag >> 3
            wire_type = tag & 0x07

            if field_num == 0 or field_num > 1000:
                break

            if wire_type == WIRE_VARINT:
                value = ('varint', self.read_varint())
            elif wire_type == WIRE_FIXED64:
                raw = self.data[self.pos:self.pos + 8]
                self.pos += 8
                try:
                    d = struct.unpack('<d', raw)[0]
                    value = ('double', d)
                except struct.error:
                    value = ('fixed64', raw.hex())
            elif wire_type == WIRE_LENGTH:
                length = self.read_varint()
                raw = self.data[self.pos:self.pos + length]
                self.pos += length
                value = ('length', raw)
            elif wire_type == WIRE_FIXED32:
                raw = self.data[self.pos:self.pos + 4]
                self.pos += 4
                try:
                    f = struct.unpack('<f', raw)[0]
                    value = ('float', f)
                except struct.error:
                    value = ('fixed32', raw.hex())
            else:
                break

            if field_num not in fields:
                fields[field_num] = []
            fields[field_num].append(value)

        except Exception:
            break

    return fields


# Attach as a method so the parser can be reused elsewhere
ProtobufParser.parse_all_fields = parse_all_fields


def main():
    parser = argparse.ArgumentParser(description="Find barometric pressure field in Raymarine data")
    parser.add_argument('-i', '--interface', required=True, help='Interface IP')
    parser.add_argument('-p', '--pressure', type=float, required=True,
                        help='Known current pressure in mbar (e.g., 1021)')
    parser.add_argument('-t', '--time', type=int, default=10, help='Capture time (seconds)')
    parser.add_argument('-g', '--group', default="226.192.206.102", help='Multicast group')
    parser.add_argument('--port', type=int, default=2565, help='UDP port')
    args = parser.parse_args()

    print(f"Pressure Finder - Looking for {args.pressure} mbar")
    print("=" * 60)
    print("Target values to find:")
    print(f"  mbar/hPa: {args.pressure:.1f}")
    print(f"  Pascals:  {args.pressure * 100:.0f}")
    print(f"  kPa:      {args.pressure / 10:.2f}")
    print(f"  inHg:     {args.pressure * 0.02953:.2f}")
    print(f"  bar:      {args.pressure / 1000:.4f}")
    print("=" * 60)
    print(f"\nListening on {args.group}:{args.port} for {args.time} seconds...")

    # Create socket
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    if hasattr(socket, 'SO_REUSEPORT'):
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    sock.bind(('', args.port))
    mreq = struct.pack("4s4s", socket.inet_aton(args.group), socket.inet_aton(args.interface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(1.0)

    candidates = {}  # field_path -> info
    packet_count = 0
    end_time = time.time() + args.time

    try:
        while time.time() < end_time:
            try:
                data, _ = sock.recvfrom(65535)
                packet_count += 1
                scan_packet(data, args.pressure, args.group, args.port, candidates)
            except socket.timeout:
                continue
    except KeyboardInterrupt:
        pass
    finally:
        sock.close()

    print(f"\n\nResults after scanning {packet_count} packets:")
    print("=" * 60)

    if not candidates:
        print("No pressure-like values found!")
        print("\nSuggestions:")
        print("1. Verify the pressure sensor is connected and broadcasting")
        print("2. Try different multicast groups (239.2.1.1:2154)")
        print("3. Check if pressure is in a different packet size")
    else:
        print(f"\nFound {len(candidates)} candidate field(s):\n")

        # Sort by count (most frequent first)
        for path, info in sorted(candidates.items(), key=lambda x: -x[1]['count']):
            values = info['values']
            avg_val = sum(values) / len(values)
            min_val = min(values)
            max_val = max(values)

            print(f"Field {path}:")
            print(f"  Type: {info['type']}")
            print(f"  Matches: {info['count']} packets")
            print(f"  Unit likely: {info['unit']}")
            print(f"  Values: min={min_val:.2f}, max={max_val:.2f}, avg={avg_val:.2f}")
            if info['unit'] == 'Pascals':
                print(f"  As mbar: {avg_val/100:.1f} mbar")
            elif info['unit'] == 'kPa':
                print(f"  As mbar: {avg_val*10:.1f} mbar")
            print()

    print("\nDone.")


if __name__ == "__main__":
    main()
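The unit-matching idea in `is_pressure_like` can be sketched in isolation: convert the known reference pressure into each candidate unit and accept any observed float within a relative tolerance. This is a trimmed, hypothetical variant with fewer units than the script's:

```python
def match_pressure_unit(val: float, target_mbar: float, tol: float = 0.02):
    """Return the unit whose conversion of target_mbar lies within `tol` of val."""
    conversions = [
        (target_mbar, "mbar/hPa"),
        (target_mbar * 100, "Pa"),
        (target_mbar / 10, "kPa"),
        (target_mbar / 1000, "bar"),
        (target_mbar * 0.02953, "inHg"),
    ]
    for target, unit in conversions:
        if target > 0 and abs(val - target) / target < tol:
            return unit
    return None


print(match_pressure_unit(102100.0, 1021))  # matches the Pascals conversion
print(match_pressure_unit(30.1, 1021))      # matches the inHg conversion
```

A 2% relative tolerance is loose enough to survive sensor drift between the MFD reading and the reference barometer, but tight enough that the unit scales (each roughly a factor of 10 apart) cannot be confused.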
1042
axiom-nmea/debug/protobuf_decoder.py
Normal file
File diff suppressed because it is too large
331
axiom-nmea/debug/protobuf_parser.py
Normal file
@@ -0,0 +1,331 @@
#!/usr/bin/env python3
"""
Proper protobuf wire format parser for Raymarine packets.
Decodes the nested message structure to understand the protocol.
"""

import struct
from dataclasses import dataclass
from typing import List, Tuple, Optional, Any

# Wire types
WIRE_VARINT = 0   # int32, int64, uint32, uint64, sint32, sint64, bool, enum
WIRE_FIXED64 = 1  # fixed64, sfixed64, double
WIRE_LENGTH = 2   # string, bytes, embedded messages, packed repeated fields
WIRE_FIXED32 = 5  # fixed32, sfixed32, float

WIRE_NAMES = {
    0: 'varint',
    1: 'fixed64',
    2: 'length-delim',
    5: 'fixed32',
}


@dataclass
class ProtoField:
    """Represents a decoded protobuf field."""
    field_num: int
    wire_type: int
    offset: int
    length: int
    raw_value: bytes
    decoded_value: Any = None
    children: List['ProtoField'] = None

    def __post_init__(self):
        if self.children is None:
            self.children = []


class ProtobufDecoder:
    """Decodes protobuf wire format without a schema."""

    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def decode_varint(self) -> Tuple[int, int]:
        """Decode a varint, return (value, bytes_consumed)."""
        result = 0
        shift = 0
        start = self.pos

        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
            if shift > 63:
                break

        return result, self.pos - start

    def decode_fixed64(self) -> bytes:
        """Decode 8 bytes (fixed64/double)."""
        value = self.data[self.pos:self.pos + 8]
        self.pos += 8
        return value

    def decode_fixed32(self) -> bytes:
        """Decode 4 bytes (fixed32/float)."""
        value = self.data[self.pos:self.pos + 4]
        self.pos += 4
        return value

    def decode_length_delimited(self) -> bytes:
        """Decode a length-delimited field (string, bytes, nested message)."""
        length, _ = self.decode_varint()
        value = self.data[self.pos:self.pos + length]
        self.pos += length
        return value

    def decode_field(self) -> Optional[ProtoField]:
        """Decode a single protobuf field."""
        if self.pos >= len(self.data):
            return None

        start_offset = self.pos

        # Decode tag
        tag, _ = self.decode_varint()
        field_num = tag >> 3
        wire_type = tag & 0x07

        # Sanity check
        if field_num == 0 or field_num > 536870911:  # Max field number (2^29 - 1)
            return None

        try:
            if wire_type == WIRE_VARINT:
                value, _ = self.decode_varint()
                raw = self.data[start_offset:self.pos]
                return ProtoField(field_num, wire_type, start_offset,
                                  self.pos - start_offset, raw, value)

            elif wire_type == WIRE_FIXED64:
                raw = self.decode_fixed64()
                # Try to decode as double
                try:
                    decoded = struct.unpack('<d', raw)[0]
                except struct.error:
                    decoded = raw
                return ProtoField(field_num, wire_type, start_offset,
                                  self.pos - start_offset, raw, decoded)

            elif wire_type == WIRE_LENGTH:
                raw = self.decode_length_delimited()
                return ProtoField(field_num, wire_type, start_offset,
                                  self.pos - start_offset, raw, raw)

            elif wire_type == WIRE_FIXED32:
                raw = self.decode_fixed32()
                # Try to decode as float
                try:
                    decoded = struct.unpack('<f', raw)[0]
                except struct.error:
                    decoded = raw
                return ProtoField(field_num, wire_type, start_offset,
                                  self.pos - start_offset, raw, decoded)

            else:
                # Unknown wire type
                return None

        except (IndexError, struct.error):
            return None

    def decode_all(self) -> List[ProtoField]:
        """Decode all fields in the buffer."""
        fields = []
        while self.pos < len(self.data):
            field = self.decode_field()
            if field is None:
                break
            fields.append(field)
        return fields


def try_decode_nested(field: ProtoField, depth: int = 0, max_depth: int = 5) -> bool:
    """Try to decode a length-delimited field as a nested message."""
    if field.wire_type != WIRE_LENGTH or depth >= max_depth:
        return False

    if len(field.raw_value) < 2:
        return False

    # Try to decode as nested protobuf
    decoder = ProtobufDecoder(field.raw_value)
    children = []

    try:
        while decoder.pos < len(decoder.data):
            child = decoder.decode_field()
            if child is None:
                break
            # Recursively try to decode nested messages
            if child.wire_type == WIRE_LENGTH:
                try_decode_nested(child, depth + 1, max_depth)
            children.append(child)

        # Only consider it a valid nested message if we decoded most of the data
        if children and decoder.pos >= len(decoder.data) * 0.8:
            field.children = children
            return True
    except Exception:
        pass

    return False


def print_field(field: ProtoField, indent: int = 0, show_raw: bool = False):
    """Pretty-print a protobuf field."""
    prefix = "  " * indent
    wire_name = WIRE_NAMES.get(field.wire_type, f'unknown({field.wire_type})')

    # Format value based on type
    if field.wire_type == WIRE_VARINT:
        value_str = f"{field.decoded_value}"
    elif field.wire_type == WIRE_FIXED64:
        if isinstance(field.decoded_value, float):
            value_str = f"{field.decoded_value:.6f}"
            # Check if it could be coordinates
            if -90 <= field.decoded_value <= 90:
                value_str += " (could be lat)"
            elif -180 <= field.decoded_value <= 180:
                value_str += " (could be lon)"
        else:
            value_str = field.raw_value.hex()
    elif field.wire_type == WIRE_FIXED32:
        if isinstance(field.decoded_value, float):
            value_str = f"{field.decoded_value:.4f}"
            # Check if it could be an angle in radians
            if 0 <= field.decoded_value <= 6.5:
                deg = field.decoded_value * 57.2958
                value_str += f" ({deg:.1f}°)"
        else:
            value_str = field.raw_value.hex()
    elif field.wire_type == WIRE_LENGTH:
        if field.children:
            value_str = f"[nested message, {len(field.children)} fields]"
        else:
            # Try to show as string if printable
            try:
                text = field.raw_value.decode('ascii')
                if all(32 <= ord(c) < 127 or c in '\n\r\t' for c in text):
                    value_str = f'"{text}"'
                else:
                    value_str = f"[{len(field.raw_value)} bytes]"
            except UnicodeDecodeError:
                value_str = f"[{len(field.raw_value)} bytes]"
    else:
        value_str = field.raw_value.hex()

    print(f"{prefix}field {field.field_num:2d} ({wire_name:12s}) @ 0x{field.offset:04x}: {value_str}")

    if show_raw and field.wire_type == WIRE_LENGTH and not field.children:
        # Show hex dump for non-nested length-delimited fields
        hex_str = ' '.join(f'{b:02x}' for b in field.raw_value[:32])
        if len(field.raw_value) > 32:
            hex_str += ' ...'
        print(f"{prefix}  raw: {hex_str}")

    # Print children
    for child in field.children:
        print_field(child, indent + 1, show_raw)


def analyze_packet(data: bytes, show_raw: bool = False):
    """Analyze a single packet."""
    print(f"\nPacket length: {len(data)} bytes")

    # Check for fixed header
    if len(data) > 20:
        header = data[:20]
        print(f"Header (first 20 bytes): {header.hex()}")

        # Look for protobuf start (usually 0x0a = field 1, length-delimited)
        proto_start = None
        for i in range(0, min(24, len(data))):
            if data[i] == 0x0a:  # field 1, wire type 2
                proto_start = i
                break

        if proto_start is not None:
            print(f"Protobuf likely starts at offset 0x{proto_start:04x}")
            proto_data = data[proto_start:]
        else:
            print("No clear protobuf start found, trying from offset 0")
            proto_data = data
    else:
        proto_data = data

    # Decode protobuf
    decoder = ProtobufDecoder(proto_data)
    fields = decoder.decode_all()

    print(f"\nDecoded {len(fields)} top-level fields:")
    print("-" * 60)

    for field in fields:
        # Try to decode nested messages
        if field.wire_type == WIRE_LENGTH:
            try_decode_nested(field)
        print_field(field, show_raw=show_raw)


def read_pcap(filename):
    """Read UDP payloads from a pcap file (Ethernet + IPv4 framing assumed)."""
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8  # Ethernet + IP + UDP headers
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


if __name__ == "__main__":
    import sys

    pcap_file = sys.argv[1] if len(sys.argv) > 1 else "raymarine_sample_TWD_62-70_HDG_29-35.pcap"

    print(f"Reading {pcap_file}...")
    packets = read_pcap(pcap_file)
    print(f"Loaded {len(packets)} packets")

    # Group by size
    by_size = {}
    for pkt in packets:
        pkt_len = len(pkt)
        if pkt_len not in by_size:
            by_size[pkt_len] = []
        by_size[pkt_len].append(pkt)

    # Analyze one packet of each key size
    target_sizes = [344, 446, 788, 888, 1472]

    for size in target_sizes:
        if size in by_size:
            print("\n" + "=" * 70)
            print(f"ANALYZING {size}-BYTE PACKET")
            print("=" * 70)
            analyze_packet(by_size[size][0], show_raw=True)
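The decoder above rests on two wire-format primitives: base-128 varints (7 payload bits per byte, MSB set on all but the last byte) and the tag, a varint whose low 3 bits are the wire type and whose remaining bits are the field number. A minimal standalone sketch of both (a re-implementation for illustration, not the module's own function):

```python
def read_varint(data: bytes, pos: int):
    """Decode a base-128 varint starting at pos; return (value, next_pos)."""
    result = 0
    shift = 0
    while pos < len(data):
        byte = data[pos]
        pos += 1
        result |= (byte & 0x7F) << shift  # low 7 bits carry payload
        if not (byte & 0x80):             # high bit clear = last byte
            break
        shift += 7
    return result, pos


# Tag byte 0x0d: field number 1, wire type 5 (fixed32)
tag, pos = read_varint(b'\x0d', 0)
print(tag >> 3, tag & 0x07)

# Two-byte varint 0x96 0x01 encodes 150
val, _ = read_varint(b'\x96\x01', 0)
print(val)
```

The same split (`tag >> 3`, `tag & 0x07`) appears throughout these debug tools; everything else in the format is just the per-wire-type payload that follows each tag.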
806
axiom-nmea/debug/raymarine_decoder.py
Executable file
@@ -0,0 +1,806 @@
#!/usr/bin/env python3
"""
Raymarine LightHouse Network Decoder

Decodes sensor data from Raymarine AXIOM MFDs broadcast over UDP multicast.
The protocol uses Protocol Buffers binary encoding (not standard NMEA 0183).

Usage:
    python raymarine_decoder.py -i 198.18.5.5
    python raymarine_decoder.py -i 198.18.5.5 --json
    python raymarine_decoder.py --pcap raymarine_sample.pcap

Multicast Groups:
    226.192.206.98:2561  - Navigation sensors
    226.192.206.99:2562  - Heartbeat/status
    226.192.206.102:2565 - Mixed sensor data
    226.192.219.0:3221   - Display sync

Author: Reverse-engineered from Raymarine network captures
"""

import argparse
import json
import os
import socket
import struct
import sys
import threading
import time
from collections import deque
from datetime import datetime
from typing import Dict, List, Optional, Tuple, Any


# Raymarine multicast configuration
MULTICAST_GROUPS = [
    ("226.192.206.98", 2561),   # Navigation sensors
    ("226.192.206.99", 2562),   # Heartbeat/status
    ("226.192.206.102", 2565),  # Mixed sensor data (primary)
    ("226.192.219.0", 3221),    # Display sync
]

# Conversion constants
RADIANS_TO_DEGREES = 57.2957795131
MS_TO_KNOTS = 1.94384449
FEET_TO_METERS = 0.3048


class SensorData:
    """Holds the current state of all decoded sensor values."""

    def __init__(self):
        self.latitude: Optional[float] = None
        self.longitude: Optional[float] = None
        self.heading_deg: Optional[float] = None
        self.wind_speed_kts: Optional[float] = None
        self.wind_direction_deg: Optional[float] = None
        self.depth_ft: Optional[float] = None
        self.water_temp_c: Optional[float] = None
        self.air_temp_c: Optional[float] = None
        self.sog_kts: Optional[float] = None  # Speed over ground
        self.cog_deg: Optional[float] = None  # Course over ground

        # Timestamps for freshness tracking
        self.gps_time: float = 0
        self.heading_time: float = 0
        self.wind_time: float = 0
        self.depth_time: float = 0
        self.temp_time: float = 0

        # Statistics
        self.packet_count: int = 0
        self.gps_count: int = 0
        self.start_time: float = time.time()

        # Thread safety
        self.lock = threading.Lock()

    def to_dict(self) -> Dict[str, Any]:
        """Convert sensor data to dictionary for JSON output."""
        with self.lock:
            return {
                "timestamp": datetime.now().isoformat(),
                "position": {
                    "latitude": self.latitude,
                    "longitude": self.longitude,
                    "age_seconds": time.time() - self.gps_time if self.gps_time else None,
                },
                "navigation": {
                    "heading_deg": self.heading_deg,
                    "sog_kts": self.sog_kts,
                    "cog_deg": self.cog_deg,
                },
                "wind": {
                    "speed_kts": self.wind_speed_kts,
                    "direction_deg": self.wind_direction_deg,
                },
                "depth": {
                    "feet": self.depth_ft,
                    "meters": self.depth_ft * FEET_TO_METERS if self.depth_ft is not None else None,
                },
                "temperature": {
                    "water_c": self.water_temp_c,
                    "air_c": self.air_temp_c,
                },
                "statistics": {
                    "packets_received": self.packet_count,
                    "gps_packets": self.gps_count,
                    "uptime_seconds": time.time() - self.start_time,
                }
            }


class ProtobufDecoder:
    """
    Decodes Raymarine's protobuf-like binary format.

    Wire types:
        0 = Varint
        1 = 64-bit (fixed64, double)
        2 = Length-delimited (string, bytes, nested message)
        5 = 32-bit (fixed32, float)
    """

    @staticmethod
    def decode_varint(data: bytes, offset: int) -> Tuple[int, int]:
        """Decode a protobuf varint, return (value, bytes_consumed)."""
        result = 0
        shift = 0
        consumed = 0

        while offset + consumed < len(data):
            byte = data[offset + consumed]
            result |= (byte & 0x7F) << shift
            consumed += 1
            if not (byte & 0x80):
                break
            shift += 7
            if shift > 63:
                break

        return result, consumed

    @staticmethod
    def decode_double(data: bytes, offset: int) -> Optional[float]:
        """Decode a little-endian 64-bit double."""
        if offset + 8 > len(data):
            return None
        try:
            return struct.unpack('<d', data[offset:offset+8])[0]
        except struct.error:
            return None

    @staticmethod
    def decode_float(data: bytes, offset: int) -> Optional[float]:
        """Decode a little-endian 32-bit float."""
        if offset + 4 > len(data):
            return None
        try:
            return struct.unpack('<f', data[offset:offset+4])[0]
        except struct.error:
            return None

    @staticmethod
    def is_valid_latitude(val: float) -> bool:
        """Check if value is a valid latitude."""
        return -90 <= val <= 90

    @staticmethod
    def is_valid_longitude(val: float) -> bool:
        """Check if value is a valid longitude."""
        return -180 <= val <= 180

    @staticmethod
    def is_valid_angle_radians(val: float) -> bool:
        """Check if value is a valid angle in radians (0 to 2*pi)."""
        return 0 <= val <= 6.5  # Slightly more than 2*pi for tolerance

    @staticmethod
    def is_valid_speed_ms(val: float) -> bool:
        """Check if value is a reasonable speed in m/s (0 to ~50 m/s = ~100 kts)."""
        return 0 <= val <= 60


class RaymarineDecoder:
    """
    Main decoder for Raymarine network packets.

    Uses GPS-anchored parsing strategy:
    1. Find GPS using reliable 0x09/0x11 pattern at offset ~0x0032
    2. Extract other values at known offsets relative to GPS or packet start
    """

    def __init__(self, sensor_data: SensorData, verbose: bool = False):
        self.sensor_data = sensor_data
        self.verbose = verbose
        self.pb = ProtobufDecoder()

        # Packet size categories for different parsing strategies
        self.SMALL_PACKETS = range(0, 200)
        self.MEDIUM_PACKETS = range(200, 600)
        self.LARGE_PACKETS = range(600, 1200)
        self.XLARGE_PACKETS = range(1200, 3000)

    def decode_packet(self, data: bytes, source: Tuple[str, int]) -> bool:
        """
        Decode a single UDP packet.
        Returns True if any useful data was extracted.
        """
        with self.sensor_data.lock:
            self.sensor_data.packet_count += 1

        if len(data) < 50:
            return False  # Too small to contain useful data

        decoded_something = False

        # Try GPS extraction (most reliable)
        if self._extract_gps(data):
            decoded_something = True

        # Try extracting other sensor data based on packet size
        pkt_len = len(data)

        if pkt_len in self.LARGE_PACKETS or pkt_len in self.XLARGE_PACKETS:
            # Large packets typically have full sensor data
            if self._extract_navigation(data):
                decoded_something = True
            if self._extract_wind(data):
                decoded_something = True
            if self._extract_depth(data):
                decoded_something = True
            if self._extract_temperature(data):
                decoded_something = True

        elif pkt_len in self.MEDIUM_PACKETS:
            # Medium packets may have partial data
            if self._extract_wind(data):
                decoded_something = True
            if self._extract_depth(data):
                decoded_something = True

        return decoded_something

    def _extract_gps(self, data: bytes) -> bool:
        """
        Extract GPS coordinates using the 0x09/0x11 pattern.

        Pattern:
            0x09 [8-byte latitude double] 0x11 [8-byte longitude double]

        Returns True if valid GPS was found.
        """
        # Scan for the GPS pattern starting around offset 0x30
        search_start = 0x20
        search_end = min(len(data) - 18, 0x100)

        for offset in range(search_start, search_end):
            if data[offset] != 0x09:
                continue

            # Check if 0x11 follows at expected position
            lon_tag_offset = offset + 9
            if lon_tag_offset >= len(data) or data[lon_tag_offset] != 0x11:
                continue

            # Decode latitude and longitude
            lat = self.pb.decode_double(data, offset + 1)
            lon = self.pb.decode_double(data, lon_tag_offset + 1)

            if lat is None or lon is None:
                continue

            # Validate coordinates
            if not self.pb.is_valid_latitude(lat) or not self.pb.is_valid_longitude(lon):
                continue

            # Additional sanity check: filter out obviously wrong values
            # Most readings should be reasonable coordinates, not near 0,0
            if abs(lat) < 0.1 and abs(lon) < 0.1:
                continue

            with self.sensor_data.lock:
                self.sensor_data.latitude = lat
                self.sensor_data.longitude = lon
                self.sensor_data.gps_time = time.time()
                self.sensor_data.gps_count += 1

            if self.verbose:
                print(f"GPS: {lat:.6f}, {lon:.6f} (offset 0x{offset:04x})")

            return True

        return False

    def _extract_navigation(self, data: bytes) -> bool:
        """
        Extract heading, SOG, COG from packet.
        These are typically 32-bit floats in radians.
        """
        found = False

        # Look for heading at known offsets for large packets
        heading_offsets = [0x006f, 0x00d4, 0x0073, 0x00d8]

        for offset in heading_offsets:
            if offset + 4 > len(data):
                continue

            # Check for float tag (wire type 5)
            if offset > 0 and (data[offset - 1] & 0x07) == 5:
                val = self.pb.decode_float(data, offset)
                # Explicit None check: a heading of 0.0 radians is valid
                if val is not None and self.pb.is_valid_angle_radians(val):
                    heading_deg = val * RADIANS_TO_DEGREES
                    with self.sensor_data.lock:
                        self.sensor_data.heading_deg = heading_deg % 360
                        self.sensor_data.heading_time = time.time()
                    found = True
                    break

        return found

    def _extract_wind(self, data: bytes) -> bool:
        """
        Extract wind speed and direction.
        Wind speed is in m/s, direction in radians.

        Known offsets by packet size (discovered via pcap analysis):
        - 344 bytes:  speed @ 0x00a5, dir @ 0x00a0
        - 446 bytes:  speed @ 0x00ac, dir @ 0x00a7
        - 788 bytes:  speed @ 0x00ca, dir @ 0x00c5
        - 888 bytes:  speed @ 0x00ca, dir @ 0x00c5
        - 931 bytes:  speed @ 0x00ca, dir @ 0x00c5
        - 1031 bytes: speed @ 0x00ca, dir @ 0x00c5
        - 1472 bytes: speed @ 0x0101, dir @ 0x00fc

        Note: 878-byte packets do NOT contain wind data at these offsets.
        """
        pkt_len = len(data)

        # Define offset pairs (speed_offset, dir_offset) for SPECIFIC packet sizes
        # Only process packet sizes known to contain wind data
        offset_pairs = None

        if pkt_len == 344:
            offset_pairs = [(0x00a5, 0x00a0)]
        elif pkt_len == 446:
            offset_pairs = [(0x00ac, 0x00a7)]
        elif pkt_len in (788, 888, 931, 1031):
            offset_pairs = [(0x00ca, 0x00c5)]
        elif pkt_len == 1472:
            offset_pairs = [(0x0101, 0x00fc)]

        # Skip unknown packet sizes to avoid garbage values
        if offset_pairs is None:
            return False

        for speed_offset, dir_offset in offset_pairs:
            if speed_offset + 4 > pkt_len or dir_offset + 4 > pkt_len:
                continue

            speed_val = self.pb.decode_float(data, speed_offset)
            dir_val = self.pb.decode_float(data, dir_offset)

            if speed_val is None or dir_val is None:
                continue

            # Validate: speed 0.1-50 m/s (~0.2-97 kts), direction 0-2*pi radians
            if not (0.1 < speed_val < 50):
                continue
            if not (0 <= dir_val <= 6.5):
                continue

            # Convert and store
            with self.sensor_data.lock:
                self.sensor_data.wind_speed_kts = speed_val * MS_TO_KNOTS
                self.sensor_data.wind_direction_deg = (dir_val * RADIANS_TO_DEGREES) % 360
                self.sensor_data.wind_time = time.time()
            return True

        return False

    def _extract_depth(self, data: bytes) -> bool:
        """
        Extract depth value (in feet, stored as 64-bit double).
        Depth is tagged with field 5 (0x29) or field 11 (0x59) wire type 1.
        """
        # Search for depth by looking for wire type 1 tags with field 5 or 11
        # Tag format: (field_number << 3) | wire_type
        # Field 5, wire type 1 = (5 << 3) | 1 = 0x29
        # Field 11, wire type 1 = (11 << 3) | 1 = 0x59

        depth_tags = [0x29, 0x59]  # Field 5 and 11, wire type 1

        for offset in range(0x40, min(len(data) - 9, 0x300)):
            tag = data[offset]
            if tag not in depth_tags:
                continue

            val = self.pb.decode_double(data, offset + 1)
            if val is None:
                continue

            # Reasonable depth range: 0.5 to 500 feet
            if 0.5 < val < 500:
                with self.sensor_data.lock:
                    self.sensor_data.depth_ft = val
                    self.sensor_data.depth_time = time.time()
                return True

        # Fallback: scan for any reasonable depth-like double values
        # in larger packets where we have more sensor data
        if len(data) > 800:
            for offset in range(0x80, min(len(data) - 9, 0x200)):
                # Only check positions that look like protobuf fields
                tag = data[offset]
                if (tag & 0x07) != 1:  # Wire type 1 (double)
                    continue

                val = self.pb.decode_double(data, offset + 1)
                if val is None:
                    continue

                # Typical depth range for Florida Keys: 3-50 feet
                if 2 < val < 100:
                    with self.sensor_data.lock:
                        self.sensor_data.depth_ft = val
                        self.sensor_data.depth_time = time.time()
                    return True

        return False

    def _extract_temperature(self, data: bytes) -> bool:
        """
        Extract temperature values (water and air).
        Temperature encoding is not yet fully understood.
        Might be in Kelvin, Celsius, or Fahrenheit.

        Note: Temperature extraction is experimental and may not produce
        reliable results without the proprietary protobuf schema.
        """
        # Temperature extraction is currently unreliable
        # The protocol documentation notes temperature has not been found
        # TODO: Refine when temperature field offsets are discovered

        # Search for temperature-like values with stricter validation
        # Only look at specific wire type 1 (double) fields
        for offset in range(0x50, min(len(data) - 9, 0x200)):
            # Must be preceded by a wire type 1 tag
            tag = data[offset]
            if (tag & 0x07) != 1:  # Wire type 1 = 64-bit
                continue

            field_num = tag >> 3
            # Temperature fields are likely in a reasonable field number range
            if field_num < 1 or field_num > 30:
                continue

            val = self.pb.decode_double(data, offset + 1)
            if val is None:
                continue

            # Very strict validation for Kelvin range (water temp 15-35°C)
            if 288 < val < 308:  # 15°C to 35°C in Kelvin
                temp_c = val - 273.15
                with self.sensor_data.lock:
                    if self.sensor_data.water_temp_c is None:
                        self.sensor_data.water_temp_c = temp_c
                        self.sensor_data.temp_time = time.time()
                return True

        return False


class MulticastListener:
    """
    Listens on multiple multicast groups and feeds packets to the decoder.
    """

    def __init__(self, decoder: RaymarineDecoder, interface_ip: str,
                 groups: List[Tuple[str, int]] = None):
        self.decoder = decoder
        self.interface_ip = interface_ip
        self.groups = groups or MULTICAST_GROUPS
        self.sockets: List[socket.socket] = []
        self.running = False
        self.threads: List[threading.Thread] = []

    def _create_socket(self, group: str, port: int) -> Optional[socket.socket]:
        """Create and configure a multicast socket."""
        try:
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)

            # Try SO_REUSEPORT if available (Linux)
            if hasattr(socket, 'SO_REUSEPORT'):
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)

            # Bind to the port
            sock.bind(('', port))

            # Join multicast group on the given interface
            mreq = struct.pack("4s4s",
                               socket.inet_aton(group),
                               socket.inet_aton(self.interface_ip))
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

            # Set receive timeout
            sock.settimeout(1.0)

            return sock
        except Exception as e:
            print(f"Error creating socket for {group}:{port}: {e}", file=sys.stderr)
            return None

    def _listener_thread(self, sock: socket.socket, group: str, port: int):
        """Thread function to listen on a single multicast group."""
        while self.running:
            try:
                data, addr = sock.recvfrom(65535)
                self.decoder.decode_packet(data, addr)
            except socket.timeout:
                continue
            except Exception as e:
                if self.running:
                    print(f"Error receiving on {group}:{port}: {e}", file=sys.stderr)

    def start(self):
        """Start listening on all multicast groups."""
        self.running = True

        for group, port in self.groups:
            sock = self._create_socket(group, port)
            if sock:
                self.sockets.append(sock)
                thread = threading.Thread(
                    target=self._listener_thread,
                    args=(sock, group, port),
                    daemon=True
                )
                thread.start()
                self.threads.append(thread)
                print(f"Listening on {group}:{port}")

        if not self.sockets:
            raise RuntimeError("Failed to create any multicast sockets")

    def stop(self):
        """Stop listening and clean up."""
        self.running = False

        for thread in self.threads:
            thread.join(timeout=2.0)

        for sock in self.sockets:
            try:
                sock.close()
            except Exception:
                pass

        self.sockets = []
        self.threads = []


class PcapReader:
    """
    Read packets from a pcap file for offline analysis.
    Supports pcap format (not pcapng).
    """

    PCAP_MAGIC = 0xa1b2c3d4
    PCAP_MAGIC_SWAPPED = 0xd4c3b2a1

    def __init__(self, filename: str):
        self.filename = filename
        self.swapped = False

    def read_packets(self):
        """Generator that yields (timestamp, payload, (src_ip, src_port)) tuples."""
        with open(self.filename, 'rb') as f:
            # Read global header
            header = f.read(24)
            if len(header) < 24:
                raise ValueError("Invalid pcap file: too short")

            magic = struct.unpack('<I', header[0:4])[0]
            if magic == self.PCAP_MAGIC:
                self.swapped = False
            elif magic == self.PCAP_MAGIC_SWAPPED:
                self.swapped = True
            else:
                raise ValueError(f"Invalid pcap magic: 0x{magic:08x}")

            endian = '>' if self.swapped else '<'

            # Read packets
            while True:
                pkt_header = f.read(16)
                if len(pkt_header) < 16:
                    break

                ts_sec, ts_usec, incl_len, orig_len = struct.unpack(
                    f'{endian}IIII', pkt_header
                )

                pkt_data = f.read(incl_len)
                if len(pkt_data) < incl_len:
                    break

                # Skip Ethernet header (14 bytes), IP header (20 bytes min)
                # and UDP header (8 bytes) to get to payload
                if len(pkt_data) > 42:
                    # Check for IPv4
                    if pkt_data[12:14] == b'\x08\x00':
                        ip_header_len = (pkt_data[14] & 0x0F) * 4
                        udp_start = 14 + ip_header_len
                        payload_start = udp_start + 8

                        if payload_start < len(pkt_data):
                            # Extract source IP and port
                            src_ip = '.'.join(str(b) for b in pkt_data[26:30])
                            src_port = struct.unpack('!H', pkt_data[udp_start:udp_start+2])[0]

                            payload = pkt_data[payload_start:]
                            yield (ts_sec + ts_usec / 1e6, payload, (src_ip, src_port))


def format_lat_lon(lat: float, lon: float) -> str:
    """Format coordinates as degrees and decimal minutes."""
    lat_dir = 'N' if lat >= 0 else 'S'
    lon_dir = 'E' if lon >= 0 else 'W'

    lat = abs(lat)
    lon = abs(lon)

    lat_deg = int(lat)
    lat_min = (lat - lat_deg) * 60

    lon_deg = int(lon)
    lon_min = (lon - lon_deg) * 60

    return f"{lat_deg:3d}° {lat_min:06.3f}' {lat_dir}, {lon_deg:3d}° {lon_min:06.3f}' {lon_dir}"


def display_dashboard(sensor_data: SensorData):
    """Display a simple text dashboard."""
    now = time.time()

    # Clear screen
    print("\033[2J\033[H", end="")

    # Header
    timestamp = datetime.now().strftime("%H:%M:%S")
    print("=" * 70)
    print(f"  RAYMARINE DECODER                                   {timestamp}")
    print("=" * 70)

    with sensor_data.lock:
        # GPS
        if sensor_data.latitude is not None and sensor_data.longitude is not None:
            age = now - sensor_data.gps_time
            fresh = "OK" if age < 5 else "STALE"
            pos_str = format_lat_lon(sensor_data.latitude, sensor_data.longitude)
            print(f"  GPS:     {pos_str} [{fresh}]")
        else:
            print("  GPS:     No data")

        # Heading
        if sensor_data.heading_deg is not None:
            age = now - sensor_data.heading_time
            fresh = "OK" if age < 5 else "STALE"
            print(f"  Heading: {sensor_data.heading_deg:6.1f}° [{fresh}]")
        else:
            print("  Heading: No data")

        # Wind
        if sensor_data.wind_speed_kts is not None:
            age = now - sensor_data.wind_time
            fresh = "OK" if age < 5 else "STALE"
            # Explicit None check: a wind direction of 0° is valid
            dir_str = f"@ {sensor_data.wind_direction_deg:.0f}°" if sensor_data.wind_direction_deg is not None else ""
            print(f"  Wind:    {sensor_data.wind_speed_kts:6.1f} kts {dir_str} [{fresh}]")
        else:
            print("  Wind:    No data")

        # Depth
        if sensor_data.depth_ft is not None:
            age = now - sensor_data.depth_time
            fresh = "OK" if age < 5 else "STALE"
            depth_m = sensor_data.depth_ft * FEET_TO_METERS
            print(f"  Depth:   {sensor_data.depth_ft:6.1f} ft ({depth_m:.1f} m) [{fresh}]")
        else:
            print("  Depth:   No data")

        # Temperature
        if sensor_data.water_temp_c is not None or sensor_data.air_temp_c is not None:
            water = f"{sensor_data.water_temp_c:.1f}°C" if sensor_data.water_temp_c is not None else "---"
            air = f"{sensor_data.air_temp_c:.1f}°C" if sensor_data.air_temp_c is not None else "---"
            print(f"  Temp:    Water: {water}  Air: {air}")
        else:
            print("  Temp:    No data")

        print("-" * 70)
        uptime = now - sensor_data.start_time
        print(f"  Packets: {sensor_data.packet_count}  GPS fixes: {sensor_data.gps_count}  Uptime: {uptime:.0f}s")

    print("=" * 70)
    print("  Press Ctrl+C to exit")


def output_json(sensor_data: SensorData):
    """Output sensor data as JSON."""
    print(json.dumps(sensor_data.to_dict(), indent=2))


def main():
    parser = argparse.ArgumentParser(
        description="Decode Raymarine LightHouse network data",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  %(prog)s -i 198.18.5.5            Live capture with dashboard
  %(prog)s -i 198.18.5.5 --json     Live capture with JSON output
  %(prog)s --pcap capture.pcap      Analyze pcap file

Multicast Groups:
  226.192.206.98:2561    Navigation sensors
  226.192.206.99:2562    Heartbeat/status
  226.192.206.102:2565   Mixed sensor data
  226.192.219.0:3221     Display sync
"""
    )

    parser.add_argument('-i', '--interface',
                        help='Interface IP address for multicast binding')
    parser.add_argument('--pcap',
                        help='Read from pcap file instead of live capture')
    parser.add_argument('--json', action='store_true',
                        help='Output as JSON instead of dashboard')
    parser.add_argument('--json-interval', type=float, default=1.0,
                        help='JSON output interval in seconds (default: 1.0)')
    parser.add_argument('-v', '--verbose', action='store_true',
                        help='Verbose output')
    parser.add_argument('--group', action='append', nargs=2,
                        metavar=('IP', 'PORT'),
                        help='Additional multicast group to listen on')

    args = parser.parse_args()

    # Validate arguments
    if not args.pcap and not args.interface:
        parser.error("Either --interface or --pcap is required")

    # Initialize sensor data and decoder
    sensor_data = SensorData()
    decoder = RaymarineDecoder(sensor_data, verbose=args.verbose)

    # Add custom groups if specified
    groups = list(MULTICAST_GROUPS)
    if args.group:
        for ip, port in args.group:
            groups.append((ip, int(port)))

    if args.pcap:
        # Pcap file analysis
        print(f"Reading from {args.pcap}...")
        reader = PcapReader(args.pcap)

        packet_count = 0
        for ts, data, source in reader.read_packets():
            decoder.decode_packet(data, source)
            packet_count += 1

        print(f"\nProcessed {packet_count} packets")
        print("\nFinal sensor state:")
        print(json.dumps(sensor_data.to_dict(), indent=2))

    else:
        # Live capture
        listener = MulticastListener(decoder, args.interface, groups)

        try:
            listener.start()
            print(f"\nListening on interface {args.interface}")
            print("Waiting for data...\n")

            while True:
                if args.json:
                    output_json(sensor_data)
                else:
                    display_dashboard(sensor_data)

                time.sleep(args.json_interval if args.json else 0.5)

        except KeyboardInterrupt:
            print("\n\nStopping...")
        finally:
            listener.stop()

            if not args.json:
                print("\nFinal sensor state:")
                print(json.dumps(sensor_data.to_dict(), indent=2))


if __name__ == "__main__":
    main()
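Editor's aside: the varint logic repeated in `ProtobufDecoder.decode_varint` and the tank scripts can be spot-checked in isolation. The sketch below is a standalone reimplementation for verification (the `encode_varint` helper is illustrative, not part of the committed files); it round-trips a value through protobuf base-128 varint encoding:

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative int as a protobuf base-128 varint (illustrative helper)."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)


def decode_varint(data: bytes, offset: int = 0):
    """Decode a varint; returns (value, bytes_consumed), mirroring ProtobufDecoder."""
    result = shift = consumed = 0
    while offset + consumed < len(data):
        byte = data[offset + consumed]
        result |= (byte & 0x7F) << shift
        consumed += 1
        if not (byte & 0x80):
            break
        shift += 7
    return result, consumed


# 300 = 0b1_0101100 -> low 7 bits 0x2C with continuation bit, then 0x02
assert encode_varint(300) == b'\xac\x02'
assert decode_varint(b'\xac\x02') == (300, 2)
```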
255
axiom-nmea/debug/tank_debug.py
Executable file
@@ -0,0 +1,255 @@
#!/usr/bin/env python3
"""
Tank Debug - Dump raw Field 16 entries to find missing IDs.
"""

import struct
import socket
import time
import threading

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.102", 2565),  # Main sensor data with tanks
]


class ProtobufParser:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def remaining(self):
        return len(self.data) - self.pos

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def read_fixed32(self) -> bytes:
        val = self.data[self.pos:self.pos + 4]
        self.pos += 4
        return val

    def read_fixed64(self) -> bytes:
        val = self.data[self.pos:self.pos + 8]
        self.pos += 8
        return val

    def read_length_delimited(self) -> bytes:
        length = self.read_varint()
        val = self.data[self.pos:self.pos + length]
        self.pos += length
        return val

    def parse_all_field16(self):
        """Parse and collect ALL Field 16 entries with full detail."""
        entries = []

        while self.pos < len(self.data):
            if self.remaining() < 1:
                break
            try:
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 1000:
                    break

                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                elif wire_type == WIRE_FIXED64:
                    value = self.read_fixed64()
                elif wire_type == WIRE_LENGTH:
                    value = self.read_length_delimited()
                elif wire_type == WIRE_FIXED32:
                    value = self.read_fixed32()
                else:
                    break

                # If this is Field 16, parse its contents in detail
                if field_num == 16 and wire_type == WIRE_LENGTH:
                    entry = self.parse_tank_entry(value)
                    entry['raw_hex'] = value.hex()
                    entry['raw_len'] = len(value)
                    entries.append(entry)

            except Exception:
                break

        return entries

    def parse_tank_entry(self, data: bytes) -> dict:
        """Parse a single tank entry and return all fields."""
        entry = {'fields': {}}
        pos = 0

        while pos < len(data):
            try:
                # Read tag
                tag_byte = data[pos]
                pos += 1

                # Handle multi-byte varints for tag
                tag = tag_byte & 0x7F
                shift = 7
                while tag_byte & 0x80 and pos < len(data):
                    tag_byte = data[pos]
                    pos += 1
                    tag |= (tag_byte & 0x7F) << shift
                    shift += 7

                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 100:
                    break

                if wire_type == WIRE_VARINT:
                    # Read varint value
                    val = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        val |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    entry['fields'][field_num] = ('varint', val)

                elif wire_type == WIRE_FIXED32:
                    raw = data[pos:pos + 4]
                    pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        entry['fields'][field_num] = ('float', f, raw.hex())
                    except struct.error:
                        entry['fields'][field_num] = ('fixed32', raw.hex())

                elif wire_type == WIRE_FIXED64:
                    raw = data[pos:pos + 8]
                    pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        entry['fields'][field_num] = ('double', d, raw.hex())
                    except struct.error:
                        entry['fields'][field_num] = ('fixed64', raw.hex())

                elif wire_type == WIRE_LENGTH:
                    # Read length
                    length = 0
                    shift = 0
                    while pos < len(data):
                        byte = data[pos]
                        pos += 1
                        length |= (byte & 0x7F) << shift
                        if not (byte & 0x80):
                            break
                        shift += 7
                    raw = data[pos:pos + length]
                    pos += length
                    entry['fields'][field_num] = ('bytes', len(raw), raw.hex()[:40])

                else:
                    break

            except Exception as e:
                entry['parse_error'] = str(e)
                break

        return entry


def scan_packet(data: bytes):
    """Scan a packet and dump all Field 16 entries."""
    if len(data) < HEADER_SIZE + 5:
        return

    proto_data = data[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    entries = parser.parse_all_field16()

    if entries:
        print(f"\n{'='*70}")
        print(f"Packet size: {len(data)} bytes, Found {len(entries)} tank entries")
        print(f"{'='*70}")

        for i, entry in enumerate(entries):
            fields = entry['fields']

            # Extract known fields
            tank_id = fields[1][1] if 1 in fields else None
            status = fields[2][1] if 2 in fields else None
            level = fields[3][1] if 3 in fields else None

            print(f"\n  Entry {i+1}: (raw length: {entry['raw_len']} bytes)")
            print(f"    Tank ID (field 1): {tank_id}")
            print(f"    Status  (field 2): {status}")
            print(f"    Level   (field 3): {level}")
            print(f"    Raw hex: {entry['raw_hex'][:60]}{'...' if len(entry['raw_hex']) > 60 else ''}")
            print(f"    All fields present: {sorted(fields.keys())}")

            # Show any extra fields
            for fn, fv in sorted(fields.items()):
                if fn not in (1, 2, 3):
                    print(f"    Field {fn}: {fv}")


def main():
    import argparse
    parser = argparse.ArgumentParser(description="Debug tank entries")
    parser.add_argument('-i', '--interface', required=True, help='Interface IP')
    parser.add_argument('-t', '--time', type=int, default=5, help='Capture time (seconds)')
    args = parser.parse_args()

    print(f"Capturing tank data for {args.time} seconds...")

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(('', 2565))
    mreq = struct.pack("4s4s", socket.inet_aton("226.192.206.102"), socket.inet_aton(args.interface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(1.0)

    seen_sizes = set()
    end_time = time.time() + args.time

    try:
        while time.time() < end_time:
            try:
                data, _ = sock.recvfrom(65535)
                # Only process each unique packet size once
                if len(data) not in seen_sizes:
                    seen_sizes.add(len(data))
                    scan_packet(data)
            except socket.timeout:
                continue
    except KeyboardInterrupt:
        pass
    finally:
        sock.close()

    print("\n\nDone.")


if __name__ == "__main__":
    main()
246
axiom-nmea/debug/tank_finder.py
Executable file
@@ -0,0 +1,246 @@
#!/usr/bin/env python3
"""
Tank Finder - Scan all multicast groups for values matching expected tank levels.
"""

import struct
import socket
import time
import threading
from typing import Dict, Any, Optional, List, Tuple

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.98", 2561),
    ("226.192.206.99", 2562),
    ("226.192.206.100", 2563),
    ("226.192.206.101", 2564),
    ("226.192.206.102", 2565),
    ("226.192.219.0", 3221),
    ("239.2.1.1", 2154),
]

# Target values to find (tank levels)
TARGET_VALUES = [
    (66, 70),      # ~68% fuel tank
    (87, 91),      # ~89% fuel tank
    (0.66, 0.70),  # Decimal range
    (0.87, 0.91),  # Decimal range
]


class ProtobufParser:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def parse(self, path: str = "") -> List[Tuple[str, str, Any]]:
        """Parse and return list of (path, type, value) for all fields."""
        results = []
        while self.pos < len(self.data):
            try:
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                if field_num == 0 or field_num > 1000:
                    break

                field_path = f"{path}.{field_num}" if path else str(field_num)

                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                    results.append((field_path, "varint", value))
                elif wire_type == WIRE_FIXED64:
                    raw = self.data[self.pos:self.pos + 8]
                    self.pos += 8
                    try:
                        d = struct.unpack('<d', raw)[0]
                        if d == d:  # not NaN
                            results.append((field_path, "double", d))
                    except Exception:
                        pass
                elif wire_type == WIRE_LENGTH:
                    length = self.read_varint()
                    raw = self.data[self.pos:self.pos + length]
                    self.pos += length
                    # Try to parse as nested
                    try:
                        nested = ProtobufParser(raw)
                        nested_results = nested.parse(field_path)
                        if nested_results:
                            results.extend(nested_results)
                    except Exception:
                        pass
                elif wire_type == WIRE_FIXED32:
                    raw = self.data[self.pos:self.pos + 4]
                    self.pos += 4
                    try:
                        f = struct.unpack('<f', raw)[0]
                        if f == f:  # not NaN
                            results.append((field_path, "float", f))
                    except Exception:
                        pass
                else:
                    break
            except Exception:
                break
        return results


def is_target_value(val: float) -> bool:
    """Check if value matches our target ranges."""
    for low, high in TARGET_VALUES:
        if low <= val <= high:
            return True
    return False


def scan_packet(data: bytes, group: str, port: int):
    """Scan a packet for target values."""
    if len(data) < HEADER_SIZE + 5:
        return

    proto_data = data[HEADER_SIZE:]
    parser = ProtobufParser(proto_data)
    fields = parser.parse()

    matches = []
    for path, vtype, value in fields:
        if isinstance(value, (int, float)) and is_target_value(value):
            matches.append((path, vtype, value))

    if matches:
        print(f"\n{'='*60}")
        print(f"MATCH on {group}:{port} (packet size: {len(data)})")
        print(f"{'='*60}")
        for path, vtype, value in matches:
            print(f"  Field {path} ({vtype}): {value}")

        # Show all Field 16 entries (tank data) for context
        print("\nAll Field 16 (Tank) entries:")
        tank_entries = {}
        for path, vtype, value in fields:
            if path.startswith("16."):
                parts = path.split(".")
                if len(parts) >= 2:
                    # Group by the implicit index (based on order seen)
                    entry_key = path  # We'll group differently
                    tank_entries[path] = (vtype, value)

        # Parse Field 16 entries properly - group consecutive 16.x fields
        current_tank = {}
        tank_list = []
        last_field = 0
        for path, vtype, value in fields:
            if path.startswith("16."):
                subfield = int(path.split(".")[1])
                # If we see a field number <= last, it's a new tank entry
                if subfield <= last_field and current_tank:
                    tank_list.append(current_tank)
                    current_tank = {}
                current_tank[subfield] = (vtype, value)
                last_field = subfield
        if current_tank:
            tank_list.append(current_tank)

        for i, tank in enumerate(tank_list):
            tank_id = tank.get(1, (None, "?"))[1]
            status = tank.get(2, (None, "?"))[1]
            level = tank.get(3, (None, "?"))[1]
            print(f"  Tank #{tank_id}: level={level}%, status={status}")


class MulticastScanner:
    def __init__(self, interface_ip: str):
        self.interface_ip = interface_ip
        self.running = False
        self.lock = threading.Lock()

    def _create_socket(self, group: str, port: int):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        if hasattr(socket, 'SO_REUSEPORT'):
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
        sock.bind(('', port))
        mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(self.interface_ip))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(1.0)
        return sock

    def _listen(self, sock, group: str, port: int):
        seen_sizes = set()
        while self.running:
            try:
                data, _ = sock.recvfrom(65535)
                # Only process each unique packet size once per group
                size_key = len(data)
                if size_key not in seen_sizes:
                    seen_sizes.add(size_key)
                    with self.lock:
                        scan_packet(data, group, port)
            except socket.timeout:
                continue
            except Exception:
                pass

    def start(self):
        self.running = True
        threads = []
        for group, port in MULTICAST_GROUPS:
            try:
                sock = self._create_socket(group, port)
                t = threading.Thread(target=self._listen, args=(sock, group, port), daemon=True)
                t.start()
                threads.append(t)
                print(f"Scanning {group}:{port}")
            except Exception as e:
                print(f"Error on {group}:{port}: {e}")
        return threads

    def stop(self):
        self.running = False


def main():
    import argparse
    parser = argparse.ArgumentParser(description="Find tank level values in multicast data")
    parser.add_argument('-i', '--interface', required=True, help='Interface IP')
    parser.add_argument('-t', '--time', type=int, default=10, help='Scan duration (seconds)')
    args = parser.parse_args()

    print("Scanning for values in target ranges 66-70 and 87-91 (or 0.66-0.70 / 0.87-0.91)...")
    print(f"Will scan for {args.time} seconds\n")

    scanner = MulticastScanner(args.interface)
    scanner.start()

    try:
        time.sleep(args.time)
    except KeyboardInterrupt:
        pass
    finally:
        scanner.stop()
        print("\nDone scanning")


if __name__ == "__main__":
    main()
301
axiom-nmea/debug/watch_field.py
Normal file
@@ -0,0 +1,301 @@
#!/usr/bin/env python3
"""
Single Field Monitor - Watch a specific field across packets.

Usage:
    python3 watch_field.py -i 192.168.1.100 --field 7.1
    python3 watch_field.py --pcap capture.pcap --field 7.1
    python3 watch_field.py --pcap capture.pcap --field 13.4  # TWD
"""

import struct
import socket
import time
import argparse
import threading
from datetime import datetime
from typing import Dict, Any, Optional, List

WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

HEADER_SIZE = 20

MULTICAST_GROUPS = [
    ("226.192.206.102", 2565),
]


class ProtobufParser:
    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0

    def read_varint(self) -> int:
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
        return result

    def parse(self) -> Dict[int, Any]:
        fields = {}
        while self.pos < len(self.data):
            try:
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07
                if field_num == 0 or field_num > 1000:
                    break

                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                    children = None
                elif wire_type == WIRE_FIXED64:
                    value = self.data[self.pos:self.pos + 8]
                    self.pos += 8
                    children = None
                elif wire_type == WIRE_LENGTH:
                    length = self.read_varint()
                    value = self.data[self.pos:self.pos + length]
                    self.pos += length
                    try:
                        nested = ProtobufParser(value)
                        children = nested.parse()
                        if nested.pos < len(value) * 0.5:
                            children = None
                    except Exception:
                        children = None
                elif wire_type == WIRE_FIXED32:
                    value = self.data[self.pos:self.pos + 4]
                    self.pos += 4
                    children = None
                else:
                    break

                fields[field_num] = (wire_type, value, children)
            except Exception:
                break
        return fields


def get_field(fields: Dict, path: List[int]):
    """Navigate to a specific field path like [7, 1] for field 7.1"""
    current = fields
    for i, field_num in enumerate(path):
        if field_num not in current:
            return None, None, None
        wire_type, value, children = current[field_num]
        if i == len(path) - 1:
            return wire_type, value, children
        if children is None:
            return None, None, None
        current = children
    return None, None, None


def format_value(wire_type: int, value: Any) -> str:
    """Format value with multiple interpretations."""
    results = []

    if wire_type == WIRE_VARINT:
        results.append(f"int: {value}")

    elif wire_type == WIRE_FIXED64:
        try:
            d = struct.unpack('<d', value)[0]
            if d == d:  # Not NaN
                results.append(f"double: {d:.6f}")
                # Could be depth in feet
                if 0 < d < 1000:
                    results.append(f" -> {d:.1f} ft = {d * 0.3048:.1f} m")
        except Exception:
            pass
        results.append(f"hex: {value.hex()}")

    elif wire_type == WIRE_FIXED32:
        try:
            f = struct.unpack('<f', value)[0]
            if f == f:  # Not NaN
                results.append(f"float: {f:.4f}")
                if 0 <= f <= 6.5:
                    results.append(f" -> as angle: {f * 57.2958:.1f}°")
                if 0 < f < 100:
                    results.append(f" -> as m/s: {f * 1.94384:.1f} kts")
                if 0 < f < 1000:
                    results.append(f" -> as depth: {f:.1f} ft = {f * 0.3048:.1f} m")
        except Exception:
            pass
        results.append(f"hex: {value.hex()}")

    elif wire_type == WIRE_LENGTH:
        results.append(f"bytes[{len(value)}]: {value[:20].hex()}...")

    return " | ".join(results) if results else "?"


def read_pcap(filename: str):
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append((ts_sec + ts_usec / 1e6, pkt_data[payload_start:]))
    return packets


class LiveListener:
    def __init__(self, interface_ip: str):
        self.interface_ip = interface_ip
        self.running = False
        self.packets = []
        self.lock = threading.Lock()

    def _create_socket(self, group: str, port: int):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        if hasattr(socket, 'SO_REUSEPORT'):
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
        sock.bind(('', port))
        mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(self.interface_ip))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(1.0)
        return sock

    def _listen(self, sock):
        while self.running:
            try:
                data, _ = sock.recvfrom(65535)
                if len(data) >= 200:
                    with self.lock:
                        self.packets.append((time.time(), data))
                        # Keep last 100
                        if len(self.packets) > 100:
                            self.packets = self.packets[-100:]
            except socket.timeout:
                continue
            except Exception:
                pass

    def start(self):
        self.running = True
        for group, port in MULTICAST_GROUPS:
            try:
                sock = self._create_socket(group, port)
                t = threading.Thread(target=self._listen, args=(sock,), daemon=True)
                t.start()
            except Exception as e:
                print(f"Error: {e}")

    def get_latest(self):
        with self.lock:
            if self.packets:
                return self.packets[-1]
        return None, None

    def stop(self):
        self.running = False


def main():
    parser = argparse.ArgumentParser(description="Watch a specific protobuf field")
    parser.add_argument('-i', '--interface', help='Interface IP for live capture')
    parser.add_argument('--pcap', help='Read from pcap file')
    parser.add_argument('-f', '--field', required=True, help='Field path like "7.1" or "13.4"')
    parser.add_argument('-n', '--count', type=int, default=20, help='Number of samples to show')
    parser.add_argument('-t', '--interval', type=float, default=1.0, help='Seconds between samples (live)')
    args = parser.parse_args()

    if not args.pcap and not args.interface:
        parser.error("Either --interface or --pcap required")

    # Parse field path
    field_path = [int(x) for x in args.field.split('.')]
    field_str = '.'.join(str(x) for x in field_path)

    print(f"Watching Field {field_str}")
    print("=" * 80)

    if args.pcap:
        packets = read_pcap(args.pcap)
        print(f"Loaded {len(packets)} packets from {args.pcap}\n")

        count = 0
        for ts, pkt in packets:
            if len(pkt) < HEADER_SIZE + 20:
                continue

            proto_data = pkt[HEADER_SIZE:]
            parser = ProtobufParser(proto_data)
            fields = parser.parse()

            wire_type, value, children = get_field(fields, field_path)
            if wire_type is not None:
                timestamp = datetime.fromtimestamp(ts).strftime("%H:%M:%S.%f")[:-3]
                val_str = format_value(wire_type, value)
                print(f"[{timestamp}] {len(pkt):4d}B | Field {field_str}: {val_str}")
                count += 1
                if count >= args.count:
                    break

        if count == 0:
            print(f"Field {field_str} not found in any packets")

    else:
        listener = LiveListener(args.interface)
        listener.start()
        print(f"Listening... showing {args.count} samples\n")

        try:
            count = 0
            last_ts = 0
            while count < args.count:
                time.sleep(args.interval)
                ts, pkt = listener.get_latest()
                if pkt is None or ts == last_ts:
                    continue
                last_ts = ts

                proto_data = pkt[HEADER_SIZE:]
                parser = ProtobufParser(proto_data)
                fields = parser.parse()

                wire_type, value, children = get_field(fields, field_path)
                if wire_type is not None:
                    timestamp = datetime.fromtimestamp(ts).strftime("%H:%M:%S.%f")[:-3]
                    val_str = format_value(wire_type, value)
                    print(f"[{timestamp}] {len(pkt):4d}B | Field {field_str}: {val_str}")
                    count += 1
                else:
                    print(f"[{datetime.now().strftime('%H:%M:%S')}] Field {field_str} not found in {len(pkt)}B packet")

        except KeyboardInterrupt:
            print("\nStopped")
        finally:
            listener.stop()


if __name__ == "__main__":
    main()
146
axiom-nmea/debug/wind_finder.py
Normal file
@@ -0,0 +1,146 @@
#!/usr/bin/env python3
"""
Diagnostic tool to find wind speed and direction values in Raymarine packets.
Searches for float values matching expected ranges.
"""

import struct
import sys
from collections import defaultdict

# Expected values
# Wind speed: 15-20 kts = 7.7-10.3 m/s
# Wind direction: 60-90 degrees = 1.05-1.57 radians

EXPECTED_SPEED_MS_MIN = 7.0
EXPECTED_SPEED_MS_MAX = 12.0
EXPECTED_DIR_RAD_MIN = 1.0
EXPECTED_DIR_RAD_MAX = 1.7

PCAP_MAGIC = 0xa1b2c3d4


def decode_float(data, offset):
    if offset + 4 > len(data):
        return None
    try:
        return struct.unpack('<f', data[offset:offset+4])[0]
    except Exception:
        return None


def decode_double(data, offset):
    if offset + 8 > len(data):
        return None
    try:
        return struct.unpack('<d', data[offset:offset+8])[0]
    except Exception:
        return None


def read_pcap(filename):
    """Read packets from pcap file."""
    packets = []
    with open(filename, 'rb') as f:
        header = f.read(24)
        magic = struct.unpack('<I', header[0:4])[0]
        swapped = magic == 0xd4c3b2a1
        endian = '>' if swapped else '<'

        while True:
            pkt_header = f.read(16)
            if len(pkt_header) < 16:
                break
            ts_sec, ts_usec, incl_len, orig_len = struct.unpack(f'{endian}IIII', pkt_header)
            pkt_data = f.read(incl_len)
            if len(pkt_data) < incl_len:
                break

            # Extract UDP payload
            if len(pkt_data) > 42 and pkt_data[12:14] == b'\x08\x00':
                ip_header_len = (pkt_data[14] & 0x0F) * 4
                payload_start = 14 + ip_header_len + 8
                if payload_start < len(pkt_data):
                    packets.append(pkt_data[payload_start:])
    return packets


def find_wind_candidates(packets):
    """Find all float values that could be wind speed or direction."""

    speed_candidates = defaultdict(list)  # offset -> list of values
    dir_candidates = defaultdict(list)

    for pkt_idx, data in enumerate(packets):
        if len(data) < 100:
            continue

        # Search for 32-bit floats
        for offset in range(0x30, min(len(data) - 4, 0x300)):
            val = decode_float(data, offset)
            if val is None or val != val:  # NaN check
                continue

            # Check for wind speed range (m/s)
            if EXPECTED_SPEED_MS_MIN <= val <= EXPECTED_SPEED_MS_MAX:
                speed_candidates[offset].append((pkt_idx, val, len(data)))

            # Check for direction range (radians)
            if EXPECTED_DIR_RAD_MIN <= val <= EXPECTED_DIR_RAD_MAX:
                dir_candidates[offset].append((pkt_idx, val, len(data)))

    return speed_candidates, dir_candidates


def main():
    filename = sys.argv[1] if len(sys.argv) > 1 else "raymarine_sample.pcap"

    print(f"Reading {filename}...")
    packets = read_pcap(filename)
    print(f"Loaded {len(packets)} packets\n")

    print(f"Searching for wind speed values ({EXPECTED_SPEED_MS_MIN}-{EXPECTED_SPEED_MS_MAX} m/s)")
    print(f"Searching for wind direction values ({EXPECTED_DIR_RAD_MIN}-{EXPECTED_DIR_RAD_MAX} rad)\n")

    speed_candidates, dir_candidates = find_wind_candidates(packets)

    print("=" * 70)
    print("WIND SPEED CANDIDATES (m/s)")
    print("=" * 70)

    # Sort by number of occurrences
    for offset in sorted(speed_candidates.keys(), key=lambda x: -len(speed_candidates[x]))[:15]:
        hits = speed_candidates[offset]
        values = [v for _, v, _ in hits]
        pkt_sizes = set(s for _, _, s in hits)
        avg_val = sum(values) / len(values)
        avg_kts = avg_val * 1.94384
        print(f"  Offset 0x{offset:04x}: {len(hits):4d} hits, avg {avg_val:.2f} m/s ({avg_kts:.1f} kts), sizes: {sorted(pkt_sizes)[:5]}")

    print("\n" + "=" * 70)
    print("WIND DIRECTION CANDIDATES (radians)")
    print("=" * 70)

    for offset in sorted(dir_candidates.keys(), key=lambda x: -len(dir_candidates[x]))[:15]:
        hits = dir_candidates[offset]
        values = [v for _, v, _ in hits]
        pkt_sizes = set(s for _, _, s in hits)
        avg_val = sum(values) / len(values)
        avg_deg = avg_val * 57.2958
        print(f"  Offset 0x{offset:04x}: {len(hits):4d} hits, avg {avg_val:.2f} rad ({avg_deg:.1f}°), sizes: {sorted(pkt_sizes)[:5]}")

    # Look for paired speed+direction at consecutive offsets
    print("\n" + "=" * 70)
    print("SPEED+DIRECTION PAIRS (4 bytes apart)")
    print("=" * 70)

    for speed_offset in speed_candidates:
        dir_offset = speed_offset + 4  # Next float
        if dir_offset in dir_candidates:
            speed_hits = len(speed_candidates[speed_offset])
            dir_hits = len(dir_candidates[dir_offset])
            if speed_hits > 5 and dir_hits > 5:
                speed_vals = [v for _, v, _ in speed_candidates[speed_offset]]
                dir_vals = [v for _, v, _ in dir_candidates[dir_offset]]
                avg_speed = sum(speed_vals) / len(speed_vals) * 1.94384
                avg_dir = sum(dir_vals) / len(dir_vals) * 57.2958
                print(f"  Speed @ 0x{speed_offset:04x} ({speed_hits} hits), Dir @ 0x{dir_offset:04x} ({dir_hits} hits)")
                print(f"    -> Avg: {avg_speed:.1f} kts @ {avg_dir:.1f}°")


if __name__ == "__main__":
    main()
487
axiom-nmea/examples/dbus-raymarine-publisher/README.md
Normal file
@@ -0,0 +1,487 @@
# Raymarine D-Bus Publisher

Publishes Raymarine sensor data to Venus OS via D-Bus, making it available to the Victron ecosystem (VRM, GX Touch display, etc.). Also runs an NMEA TCP server for navigation apps.

## Published Services

| Service | Description |
|---------|-------------|
| `com.victronenergy.gps.raymarine_0` | GPS position, speed, course |
| `com.victronenergy.meteo.raymarine_0` | Wind direction/speed, air temp, pressure |
| `com.victronenergy.navigation.raymarine_0` | Heading, depth, water temperature |
| `com.victronenergy.tank.raymarine_tankN_0` | Tank level for each tank |
| `com.victronenergy.battery.raymarine_batN_0` | Battery voltage for each battery |

## Quick Deployment

### 1. Build the package

From the `examples/dbus-raymarine-publisher` directory:

```bash
./build-package.sh
```

This creates `dbus-raymarine-publisher-1.0.0.tar.gz` with the following structure:
```
dbus-raymarine-publisher/
    raymarine_nmea/        # Python library
    venus_publisher.py     # Publisher script
    service/               # Daemontools service files
    install.sh             # Installation script
    uninstall.sh           # Removal script
    README.md              # This file
    VERSION                # Build info
```

### 2. Copy to Venus OS

```bash
scp dbus-raymarine-publisher-1.0.0.tar.gz root@venus:/data/
```

### 3. Extract on Venus OS

```bash
ssh root@venus
cd /data
tar -xzf dbus-raymarine-publisher-1.0.0.tar.gz
```

### 4. Run the installer

```bash
bash /data/dbus-raymarine-publisher/install.sh
```

The installer will:
- Find velib_python and create a symlink
- Prompt you to select a network interface (eth0, wlan0, or custom IP)
- Configure the NMEA TCP server port
- Install the daemontools service
- Set up automatic logging
- Configure rc.local for firmware update survival

### 4b. Or test manually first

```bash
cd /data/dbus-raymarine-publisher
python3 venus_publisher.py --interface eth0
```

## Service Management

After installation, control the service with:

```bash
# Check status
svstat /service/dbus-raymarine-publisher

# View logs
tail -F /var/log/dbus-raymarine-publisher/current | tai64nlocal

# Stop service
svc -d /service/dbus-raymarine-publisher

# Start service
svc -u /service/dbus-raymarine-publisher

# Restart service
svc -t /service/dbus-raymarine-publisher
```

## Changing the Network Interface

Edit the run script:
```bash
vi /data/dbus-raymarine-publisher/service/run
```

Change the `INTERFACE` variable to `eth0`, `wlan0`, or a specific IP address.

Then restart the service:
```bash
svc -t /service/dbus-raymarine-publisher
```

## Surviving Firmware Updates

The installer automatically configures `/data/rc.local` to restore the service symlink after Venus OS firmware updates.
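On Venus OS, a firmware update replaces the rootfs (including `/service`) while `/data` survives, so the restore step typically amounts to re-creating the daemontools symlink on boot. A hypothetical sketch of what such an `rc.local` entry looks like; the actual line is written by `install.sh`, and the paths here are assumptions:

```shell
#!/bin/sh
# /data/rc.local runs on every boot, including the first boot after a
# firmware update. Re-create the daemontools symlink so svscan picks the
# service up again (paths are illustrative).
ln -sfn /data/dbus-raymarine-publisher/service /service/dbus-raymarine-publisher
```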
## Uninstalling

```bash
bash /data/dbus-raymarine-publisher/uninstall.sh
```

To completely remove all files:
```bash
rm -rf /data/dbus-raymarine-publisher /var/log/dbus-raymarine-publisher
```

## Command Line Options

| Option | Description |
|--------|-------------|
| `--interface IP` | Network interface or IP (default: 198.18.5.5) |
| `--no-gps` | Disable GPS service |
| `--no-meteo` | Disable Meteo (wind) service |
| `--no-navigation` | Disable Navigation service (heading, depth, water temp) |
| `--no-tanks` | Disable all Tank services |
| `--no-batteries` | Disable all Battery services |
| `--tank-ids 1,2,10` | Only publish specific tanks |
| `--battery-ids 11,13` | Only publish specific batteries |
| `--update-interval MS` | D-Bus update interval (default: 1000 ms) |
| `--nmea-tcp-port PORT` | NMEA TCP server port (default: 10110) |
| `--no-nmea-tcp` | Disable NMEA TCP server |
| `--debug` | Enable debug logging |
| `--dry-run` | Listen without D-Bus registration |
||||
|
||||
### GPS (`com.victronenergy.gps`)
|
||||
|
||||
| Path | Description | Unit |
|
||||
|------|-------------|------|
|
||||
| `/Position/Latitude` | Latitude | degrees |
|
||||
| `/Position/Longitude` | Longitude | degrees |
|
||||
| `/Speed` | Speed over ground | m/s |
|
||||
| `/Course` | Course over ground | degrees |
|
||||
| `/Fix` | GPS fix status | 0=no fix, 1=fix |
|
||||
|
||||
### Meteo (`com.victronenergy.meteo`)
|
||||
|
||||
| Path | Description | Unit |
|
||||
|------|-------------|------|
|
||||
| `/WindDirection` | True wind direction | degrees |
|
||||
| `/WindSpeed` | True wind speed | m/s |
|
||||
| `/ExternalTemperature` | Air temperature | C |
|
||||
| `/Pressure` | Barometric pressure | hPa |
|
||||
|
||||
### Navigation (`com.victronenergy.navigation`)
|
||||
|
||||
| Path | Description | Unit |
|
||||
|------|-------------|------|
|
||||
| `/Heading` | True heading | degrees |
|
||||
| `/Depth` | Depth below transducer | m |
|
||||
| `/WaterTemperature` | Water temperature | C |
|
||||
|
||||
### Tank (`com.victronenergy.tank`)
|
||||
|
||||
| Path | Description | Unit |
|
||||
|------|-------------|------|
|
||||
| `/Level` | Tank level | 0-100% |
|
||||
| `/Remaining` | Remaining volume | m3 |
|
||||
| `/Capacity` | Tank capacity | m3 |
|
||||
| `/FluidType` | Fluid type | enum |
|
||||
| `/Status` | Sensor status | 0=OK |
|
||||
|
||||
Fluid types: 0=Fuel, 1=Fresh water, 2=Waste water, 3=Live well, 4=Oil, 5=Black water
|
||||
|
||||
### Battery (`com.victronenergy.battery`)
|
||||
|
||||
| Path | Description | Unit |
|
||||
|------|-------------|------|
|
||||
| `/Dc/0/Voltage` | Battery voltage | V |
|
||||
| `/Soc` | State of charge (estimated) | % |
|
||||
| `/Alarms/LowVoltage` | Low voltage alarm | 0/1/2 |
|
||||
| `/Alarms/HighVoltage` | High voltage alarm | 0/1/2 |
|
||||
|
||||
## Testing with dbus-spy

On Venus OS, use `dbus-spy` to browse published services interactively, or the `dbus` tool to query individual values:

```bash
# List all Raymarine services
dbus -y | grep raymarine

# Read GPS position
dbus -y com.victronenergy.gps.raymarine_0 /Position/Latitude GetValue
dbus -y com.victronenergy.gps.raymarine_0 /Position/Longitude GetValue

# Read navigation data
dbus -y com.victronenergy.navigation.raymarine_0 /Heading GetValue
dbus -y com.victronenergy.navigation.raymarine_0 /Depth GetValue

# Read wind data
dbus -y com.victronenergy.meteo.raymarine_0 /WindDirection GetValue
dbus -y com.victronenergy.meteo.raymarine_0 /WindSpeed GetValue

# Read tank levels
dbus -y com.victronenergy.tank.raymarine_tank1_0 /Level GetValue
```
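
The same reads can be scripted with dbus-python. A hedged sketch (it only runs on a device where the services are registered; `com.victronenergy.BusItem` is the Victron D-Bus item interface):

```python
def read_dbus_value(service: str, path: str):
    """Read one value from a Victron D-Bus service (device-side only)."""
    import dbus  # dbus-python, preinstalled on Venus OS

    bus = dbus.SystemBus()
    obj = bus.get_object(service, path)
    return obj.GetValue(dbus_interface="com.victronenergy.BusItem")

if __name__ == "__main__":
    lat = read_dbus_value("com.victronenergy.gps.raymarine_0", "/Position/Latitude")
    print(f"Latitude: {float(lat):.6f}")
```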

## Network Requirements

The Venus OS device must be on the same network as the Raymarine LightHouse MFD. The service uses the selected interface (eth0 or wlan0) and resolves its IP address at runtime, so DHCP-assigned addresses work.

If you need a specific VLAN IP (e.g., 198.18.x.x), you can add it manually:

```bash
# Add VLAN interface (temporary)
ip addr add 198.18.4.108/16 dev eth0

# Or configure in /etc/network/interfaces for persistence
```

## MQTT Access

Venus OS includes an MQTT broker that mirrors all D-Bus values, allowing external systems (Home Assistant, Node-RED, SignalK, etc.) to access sensor data.

### Enabling MQTT

On the GX device (or Venus OS):

1. Go to **Settings > Services > MQTT**
2. Enable **MQTT on LAN**
3. Optionally enable **MQTT on LAN (SSL)** for encrypted connections

Default ports:
- **1883** - MQTT (unencrypted)
- **8883** - MQTT with SSL

### MQTT Topic Structure

Venus OS uses this topic structure:

```
N/<portal_id>/<service_type>/<instance>/<path>
```

Where:
- `N/` - Notification topic (read values)
- `<portal_id>` - Unique VRM portal ID (e.g., `b827eb123456`)
- `<service_type>` - Service category (gps, tank, battery, meteo, navigation)
- `<instance>` - Device instance number
- `<path>` - D-Bus path without leading slash

To write values, use the `W/` prefix instead of `N/`.
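
The scheme is easy to generate programmatically; a small illustration (the helper name is ours, not part of Venus OS):

```python
def venus_topic(portal_id: str, service_type: str, instance: int,
                dbus_path: str, prefix: str = "N") -> str:
    """Build a Venus OS MQTT topic from its components.

    prefix is "N" to read or "W" to write; the D-Bus path loses its
    leading slash, as described above.
    """
    return f"{prefix}/{portal_id}/{service_type}/{instance}/{dbus_path.lstrip('/')}"
```

For example, `venus_topic("b827eb123456", "gps", 0, "/Position/Latitude")` yields `N/b827eb123456/gps/0/Position/Latitude`.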

### Raymarine Sensor MQTT Topics

#### GPS Topics

| MQTT Topic | Description | Unit |
|------------|-------------|------|
| `N/<id>/gps/0/Position/Latitude` | Latitude | degrees |
| `N/<id>/gps/0/Position/Longitude` | Longitude | degrees |
| `N/<id>/gps/0/Speed` | Speed over ground | m/s |
| `N/<id>/gps/0/Course` | Course over ground | degrees |
| `N/<id>/gps/0/Fix` | GPS fix status | 0=no fix, 1=fix |

#### Meteo/Wind Topics

| MQTT Topic | Description | Unit |
|------------|-------------|------|
| `N/<id>/meteo/0/WindDirection` | True wind direction | degrees |
| `N/<id>/meteo/0/WindSpeed` | True wind speed | m/s |
| `N/<id>/meteo/0/ExternalTemperature` | Air temperature | C |
| `N/<id>/meteo/0/Pressure` | Barometric pressure | hPa |
#### Navigation Topics

| MQTT Topic | Description | Unit |
|------------|-------------|------|
| `N/<id>/navigation/0/Heading` | True heading | degrees |
| `N/<id>/navigation/0/Depth` | Depth below transducer | m |
| `N/<id>/navigation/0/WaterTemperature` | Water temperature | C |

#### Tank Topics

Each tank is published as a separate instance. With the default tank configuration:

| Tank | Instance | MQTT Base Topic |
|------|----------|-----------------|
| Fuel Starboard (ID 1) | 0 | `N/<id>/tank/0/...` |
| Fuel Port (ID 2) | 1 | `N/<id>/tank/1/...` |
| Water Bow (ID 10) | 2 | `N/<id>/tank/2/...` |
| Water Stern (ID 11) | 3 | `N/<id>/tank/3/...` |
| Black Water (ID 100) | 4 | `N/<id>/tank/4/...` |

Available paths per tank instance:

| Path Suffix | Description | Unit |
|-------------|-------------|------|
| `/Level` | Tank fill level | 0-100% |
| `/Remaining` | Remaining volume | m3 |
| `/Capacity` | Total capacity | m3 |
| `/FluidType` | Fluid type enum | see Tank section above |
| `/Status` | Sensor status | 0=OK |
| `/CustomName` | Tank name | string |

Example full topics for the Fuel Starboard tank:

```
N/<id>/tank/0/Level
N/<id>/tank/0/Remaining
N/<id>/tank/0/Capacity
N/<id>/tank/0/FluidType
```

#### Battery Topics

Each battery is published as a separate instance. With the default battery configuration:

| Battery | Instance | MQTT Base Topic |
|---------|----------|-----------------|
| House Bow (ID 11) | 0 | `N/<id>/battery/0/...` |
| House Stern (ID 13) | 1 | `N/<id>/battery/1/...` |
| Engine Port (ID 1000) | 2 | `N/<id>/battery/2/...` |
| Engine Starboard (ID 1001) | 3 | `N/<id>/battery/3/...` |

Available paths per battery instance:

| Path Suffix | Description | Unit |
|-------------|-------------|------|
| `/Dc/0/Voltage` | Battery voltage | V DC |
| `/Soc` | State of charge (estimated) | 0-100% |
| `/Alarms/LowVoltage` | Low voltage alarm | 0/1/2 |
| `/Alarms/HighVoltage` | High voltage alarm | 0/1/2 |
| `/CustomName` | Battery name | string |

Example full topics for the House Bow battery:

```
N/<id>/battery/0/Dc/0/Voltage
N/<id>/battery/0/Soc
N/<id>/battery/0/Alarms/LowVoltage
```

### MQTT Message Format

Values are published as JSON:

```json
{"value": 25.4}
```

For string values:

```json
{"value": "Fuel Starboard"}
```
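
Client code therefore unwraps the `value` key after JSON-decoding the payload; a minimal sketch:

```python
import json

def unwrap(payload: bytes):
    """Extract the value from a Venus OS MQTT message payload."""
    return json.loads(payload)["value"]
```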

### Subscribing to All Raymarine Sensors

```bash
# Subscribe to all GPS data
mosquitto_sub -h <venus_ip> -t 'N/+/gps/#' -v

# Subscribe to all navigation data
mosquitto_sub -h <venus_ip> -t 'N/+/navigation/#' -v

# Subscribe to all tank data
mosquitto_sub -h <venus_ip> -t 'N/+/tank/#' -v

# Subscribe to all battery data
mosquitto_sub -h <venus_ip> -t 'N/+/battery/#' -v

# Subscribe to all meteo/wind data
mosquitto_sub -h <venus_ip> -t 'N/+/meteo/#' -v

# Subscribe to everything
mosquitto_sub -h <venus_ip> -t 'N/#' -v
```

### Keep-Alive Requirement

Venus OS MQTT requires periodic keep-alive messages to continue receiving updates. Send an empty message to the read (`R/`) topic:

```bash
# Initial subscription request
mosquitto_pub -h <venus_ip> -t 'R/<portal_id>/system/0/Serial' -m ''

# Or request all values
mosquitto_pub -h <venus_ip> -t 'R/<portal_id>/keepalive' -m ''
```

For continuous monitoring, send a keep-alive every 30-60 seconds.
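
A long-running client can fold this into its main loop. A hedged sketch using the third-party paho-mqtt package (the host name and portal ID are placeholders to substitute with your own):

```python
import time

PORTAL_ID = "b827eb123456"  # placeholder - use your VRM portal ID

def keepalive_topic(portal_id: str) -> str:
    """R/ topic that asks the broker to keep publishing all values."""
    return f"R/{portal_id}/keepalive"

def run(host: str = "venus.local", interval_s: int = 30) -> None:
    # Imported lazily so the helper above stays usable without paho-mqtt.
    import paho.mqtt.client as mqtt  # pip install paho-mqtt

    client = mqtt.Client()
    client.connect(host)
    client.loop_start()  # handle network traffic in a background thread
    try:
        while True:
            client.publish(keepalive_topic(PORTAL_ID), "")
            time.sleep(interval_s)
    finally:
        client.loop_stop()

if __name__ == "__main__":
    run()
```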

### Home Assistant Integration

Example MQTT sensor configuration for Home Assistant:

```yaml
mqtt:
  sensor:
    # GPS Position
    - name: "Boat Latitude"
      state_topic: "N/<portal_id>/gps/0/Position/Latitude"
      value_template: "{{ value_json.value }}"
      unit_of_measurement: "deg"

    - name: "Boat Longitude"
      state_topic: "N/<portal_id>/gps/0/Position/Longitude"
      value_template: "{{ value_json.value }}"
      unit_of_measurement: "deg"

    - name: "Boat Speed"
      state_topic: "N/<portal_id>/gps/0/Speed"
      value_template: "{{ (value_json.value * 1.94384) | round(1) }}"
      unit_of_measurement: "kn"

    # Navigation
    - name: "Boat Heading"
      state_topic: "N/<portal_id>/navigation/0/Heading"
      value_template: "{{ value_json.value }}"
      unit_of_measurement: "deg"

    - name: "Water Depth"
      state_topic: "N/<portal_id>/navigation/0/Depth"
      value_template: "{{ (value_json.value * 3.28084) | round(1) }}"
      unit_of_measurement: "ft"

    # Tank Levels
    - name: "Fuel Starboard"
      state_topic: "N/<portal_id>/tank/0/Level"
      value_template: "{{ value_json.value }}"
      unit_of_measurement: "%"

    - name: "Fresh Water Bow"
      state_topic: "N/<portal_id>/tank/2/Level"
      value_template: "{{ value_json.value }}"
      unit_of_measurement: "%"

    # Battery Voltage
    - name: "House Battery Voltage"
      state_topic: "N/<portal_id>/battery/0/Dc/0/Voltage"
      value_template: "{{ value_json.value | round(2) }}"
      unit_of_measurement: "V"
      device_class: voltage
```

### Node-RED Integration

Example Node-RED flow to monitor tank levels:

1. Add an **mqtt in** node subscribed to `N/<portal_id>/tank/+/Level`
2. Add a **json** node to parse the message
3. Add a **function** node:
   ```javascript
   msg.payload = msg.payload.value;
   msg.topic = msg.topic.split('/')[3]; // Extract tank instance
   return msg;
   ```
4. Connect to a dashboard gauge or further processing

### SignalK Integration

SignalK can subscribe to Venus MQTT and convert values to SignalK paths. Install the `signalk-venus-plugin` for automatic integration, or manually map MQTT topics in the SignalK server configuration.

## Troubleshooting

### No D-Bus services registered
- Check that velib_python is available (pre-installed on Venus OS)
- Ensure dbus-python with GLib support is available

### No data received
- Verify the network interface is configured correctly
- Check that the Raymarine MFD is broadcasting on the network
- Use `--debug` to see raw packet data

### Stale data
- Data older than 10-30 seconds is marked as stale
- Check network connectivity to the Raymarine multicast groups

### Service won't start
- Check logs: `tail /var/log/dbus-raymarine-publisher/current`
- Verify the run script is executable: `ls -la /service/dbus-raymarine-publisher/run`
- Check service status: `svstat /service/dbus-raymarine-publisher`
160
axiom-nmea/examples/dbus-raymarine-publisher/build-package.sh
Executable file
@@ -0,0 +1,160 @@
#!/bin/bash
#
# Build script for Raymarine D-Bus Publisher Venus OS package
#
# Creates a tar.gz package that can be:
# 1. Copied to a Venus OS device (Cerbo GX, Venus GX, etc.)
# 2. Untarred to /data/
# 3. Installed by running install.sh
#
# Usage:
#   ./build-package.sh                   # Creates package with default name
#   ./build-package.sh --version 1.0.0   # Creates package with version in name
#   ./build-package.sh --output /path/   # Specify output directory
#
# Installation on Venus OS:
#   scp dbus-raymarine-publisher-*.tar.gz root@<device-ip>:/data/
#   ssh root@<device-ip>
#   cd /data && tar -xzf dbus-raymarine-publisher-*.tar.gz
#   bash /data/dbus-raymarine-publisher/install.sh
#

set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"

VERSION="1.0.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-raymarine-publisher"

while [[ $# -gt 0 ]]; do
    case $1 in
        --version|-v)
            VERSION="$2"
            shift 2
            ;;
        --output|-o)
            OUTPUT_DIR="$2"
            shift 2
            ;;
        --help|-h)
            echo "Usage: $0 [OPTIONS]"
            echo ""
            echo "Options:"
            echo "  -v, --version VERSION   Set package version (default: 1.0.0)"
            echo "  -o, --output PATH       Output directory (default: script directory)"
            echo "  -h, --help              Show this help message"
            exit 0
            ;;
        *)
            echo "Unknown option: $1"
            exit 1
            ;;
    esac
done

BUILD_DATE=$(date -u +"%Y-%m-%d %H:%M:%S UTC")
BUILD_TIMESTAMP=$(date +%Y%m%d%H%M%S)

BUILD_DIR=$(mktemp -d)
PACKAGE_DIR="$BUILD_DIR/$PACKAGE_NAME"

echo "=================================================="
echo "Building $PACKAGE_NAME package"
echo "=================================================="
echo "Version: $VERSION"
echo "Build date: $BUILD_DATE"
echo "Source: $SCRIPT_DIR"
echo "Output: $OUTPUT_DIR"
echo ""

echo "1. Creating package structure..."
mkdir -p "$PACKAGE_DIR"
mkdir -p "$PACKAGE_DIR/service/log"

[ "$(uname)" = "Darwin" ] && export COPYFILE_DISABLE=1

echo "2. Copying application files..."
cp "$SCRIPT_DIR/venus_publisher.py" "$PACKAGE_DIR/"

echo "3. Copying raymarine_nmea library..."
cp -r "$PROJECT_ROOT/raymarine_nmea" "$PACKAGE_DIR/"

echo "4. Copying service files..."
cp "$SCRIPT_DIR/service/run" "$PACKAGE_DIR/service/"
cp "$SCRIPT_DIR/service/log/run" "$PACKAGE_DIR/service/log/"

echo "5. Copying installation scripts..."
cp "$SCRIPT_DIR/install.sh" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/uninstall.sh" "$PACKAGE_DIR/"

echo "6. Copying documentation..."
if [ -f "$SCRIPT_DIR/README.md" ]; then
    cp "$SCRIPT_DIR/README.md" "$PACKAGE_DIR/"
fi

echo "7. Creating version info..."
cat > "$PACKAGE_DIR/VERSION" << EOF
Package: $PACKAGE_NAME
Version: $VERSION
Build Date: $BUILD_DATE
Build Timestamp: $BUILD_TIMESTAMP

Installation:
  1. Copy to Venus OS: scp $PACKAGE_NAME-$VERSION.tar.gz root@<device-ip>:/data/
  2. SSH to device: ssh root@<device-ip>
  3. Extract: cd /data && tar -xzf $PACKAGE_NAME-$VERSION.tar.gz
  4. Install: bash /data/$PACKAGE_NAME/install.sh
EOF

echo "8. Setting permissions..."
chmod +x "$PACKAGE_DIR/venus_publisher.py"
chmod +x "$PACKAGE_DIR/install.sh"
chmod +x "$PACKAGE_DIR/uninstall.sh"
chmod +x "$PACKAGE_DIR/service/run"
chmod +x "$PACKAGE_DIR/service/log/run"

mkdir -p "$OUTPUT_DIR"

TARBALL_NAME="$PACKAGE_NAME-$VERSION.tar.gz"
OUTPUT_DIR_ABS="$(cd "$OUTPUT_DIR" && pwd)"
TARBALL_PATH="$OUTPUT_DIR_ABS/$TARBALL_NAME"

echo "9. Creating package archive..."
cd "$BUILD_DIR"
if [ "$(uname)" = "Darwin" ]; then
    if command -v xattr >/dev/null 2>&1; then
        xattr -cr "$PACKAGE_NAME"
    fi
fi
tar --format=ustar -czf "$TARBALL_PATH" "$PACKAGE_NAME"

if command -v sha256sum >/dev/null 2>&1; then
    CHECKSUM=$(sha256sum "$TARBALL_PATH" | cut -d' ' -f1)
else
    CHECKSUM=$(shasum -a 256 "$TARBALL_PATH" | cut -d' ' -f1)
fi
echo "$CHECKSUM  $TARBALL_NAME" > "$OUTPUT_DIR_ABS/$TARBALL_NAME.sha256"

echo "10. Cleaning up..."
rm -rf "$BUILD_DIR"

FILE_SIZE=$(du -h "$TARBALL_PATH" | cut -f1)

echo ""
echo "=================================================="
echo "Build complete!"
echo "=================================================="
echo ""
echo "Package: $TARBALL_PATH"
echo "Size: $FILE_SIZE"
echo "SHA256: $CHECKSUM"
echo ""
echo "Installation on Venus OS:"
echo "  scp $TARBALL_PATH root@<device-ip>:/data/"
echo "  ssh root@<device-ip>"
echo "  cd /data"
echo "  tar -xzf $TARBALL_NAME"
echo "  bash /data/$PACKAGE_NAME/install.sh"
echo ""
237
axiom-nmea/examples/dbus-raymarine-publisher/install.sh
Executable file
@@ -0,0 +1,237 @@
#!/bin/bash
#
# Installation script for Raymarine D-Bus Publisher on Venus OS
#
# Run this on the Venus OS device after copying files to /data/dbus-raymarine-publisher/
#
# Usage:
#   chmod +x install.sh
#   ./install.sh
#

set -e

INSTALL_DIR="/data/dbus-raymarine-publisher"
SERVICE_LINK="dbus-raymarine-publisher"

# Find velib_python
VELIB_DIR=""
if [ -d "/opt/victronenergy/velib_python" ]; then
    VELIB_DIR="/opt/victronenergy/velib_python"
else
    for candidate in \
        "/opt/victronenergy/dbus-systemcalc-py/ext/velib_python" \
        "/opt/victronenergy/dbus-generator/ext/velib_python" \
        "/opt/victronenergy/dbus-mqtt/ext/velib_python" \
        "/opt/victronenergy/dbus-digitalinputs/ext/velib_python" \
        "/opt/victronenergy/vrmlogger/ext/velib_python"
    do
        if [ -d "$candidate" ] && [ -f "$candidate/vedbus.py" ]; then
            VELIB_DIR="$candidate"
            break
        fi
    done
fi

if [ -z "$VELIB_DIR" ]; then
    VEDBUS_PATH=$(find /opt/victronenergy -name "vedbus.py" -path "*/velib_python/*" 2>/dev/null | head -1)
    if [ -n "$VEDBUS_PATH" ]; then
        VELIB_DIR=$(dirname "$VEDBUS_PATH")
    fi
fi

# Determine service directory
if [ -d "/service" ] && [ ! -L "/service" ]; then
    SERVICE_DIR="/service"
elif [ -d "/opt/victronenergy/service" ]; then
    SERVICE_DIR="/opt/victronenergy/service"
elif [ -L "/service" ]; then
    SERVICE_DIR=$(readlink -f /service)
else
    SERVICE_DIR="/opt/victronenergy/service"
fi

echo "=================================================="
echo "Raymarine D-Bus Publisher - Installation"
echo "=================================================="

if [ ! -d "$SERVICE_DIR" ]; then
    echo "ERROR: This doesn't appear to be a Venus OS device."
    echo "       Service directory not found."
    exit 1
fi

echo "Detected service directory: $SERVICE_DIR"

if [ ! -f "$INSTALL_DIR/venus_publisher.py" ]; then
    echo "ERROR: Installation files not found in $INSTALL_DIR"
    echo "       Please copy all files to $INSTALL_DIR first."
    exit 1
fi
if [ ! -f "$INSTALL_DIR/service/run" ]; then
    echo "ERROR: service/run not found. The package is incomplete."
    exit 1
fi

echo "1. Making scripts executable..."
chmod +x "$INSTALL_DIR/service/run"
chmod +x "$INSTALL_DIR/service/log/run"
chmod +x "$INSTALL_DIR/venus_publisher.py"

echo "2. Creating velib_python symlink..."
if [ -z "$VELIB_DIR" ]; then
    echo "ERROR: Could not find velib_python on this system."
    exit 1
fi
echo "   Found velib_python at: $VELIB_DIR"
mkdir -p "$INSTALL_DIR/ext"
if [ -L "$INSTALL_DIR/ext/velib_python" ]; then
    CURRENT_TARGET=$(readlink "$INSTALL_DIR/ext/velib_python")
    if [ "$CURRENT_TARGET" != "$VELIB_DIR" ]; then
        echo "   Updating symlink (was: $CURRENT_TARGET)"
        rm "$INSTALL_DIR/ext/velib_python"
    fi
fi
if [ ! -L "$INSTALL_DIR/ext/velib_python" ]; then
    ln -s "$VELIB_DIR" "$INSTALL_DIR/ext/velib_python"
    echo "   Symlink created: $INSTALL_DIR/ext/velib_python -> $VELIB_DIR"
else
    echo "   Symlink already exists"
fi

echo "3. Configuring network interface..."
echo ""
echo "   Available network interfaces:"
for iface in $(ls /sys/class/net/ 2>/dev/null); do
    ip=$(ip -4 addr show "$iface" 2>/dev/null | grep 'inet ' | awk '{print $2}' | cut -d/ -f1 | head -n 1)
    if [ -n "$ip" ]; then
        echo "     $iface: $ip"
    fi
done
echo ""
echo "   Select network interface for Raymarine VLAN:"
echo "     1) eth0  - Ethernet (recommended)"
echo "     2) wlan0 - WiFi"
echo "     3) Enter a specific IP address"
echo ""
read -p "   Choose [1-3]: " -n 1 -r CHOICE
echo ""

case $CHOICE in
    1) INTERFACE="eth0" ;;
    2) INTERFACE="wlan0" ;;
    3)
        read -p "   Enter IP address: " INTERFACE
        if ! echo "$INTERFACE" | grep -qE '^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$'; then
            echo "   Invalid IP address. Using eth0."
            INTERFACE="eth0"
        fi
        ;;
    *) INTERFACE="eth0" ;;
esac

echo "   Using interface: $INTERFACE"

echo "4. Configuring NMEA TCP server..."
echo ""
echo "   The service can run an NMEA 0183 TCP server for navigation apps"
echo "   (Navionics, iSailor, OpenCPN, SignalK, etc.)"
echo ""
echo "   1) Enable on default port 10110 (recommended)"
echo "   2) Enable on custom port"
echo "   3) Disable NMEA TCP server"
echo ""
read -p "   Choose [1-3]: " -n 1 -r TCP_CHOICE
echo ""

case $TCP_CHOICE in
    1) NMEA_TCP_PORT="10110" ;;
    2)
        read -p "   Enter TCP port (1024-65535): " NMEA_TCP_PORT
        if ! echo "$NMEA_TCP_PORT" | grep -qE '^[0-9]+$' || [ "$NMEA_TCP_PORT" -lt 1024 ] || [ "$NMEA_TCP_PORT" -gt 65535 ]; then
            echo "   Invalid port. Using default 10110."
            NMEA_TCP_PORT="10110"
        fi
        ;;
    3) NMEA_TCP_PORT="disabled" ;;
    *) NMEA_TCP_PORT="10110" ;;
esac

sed -i "s/^INTERFACE=.*/INTERFACE=\"$INTERFACE\"/" "$INSTALL_DIR/service/run"
sed -i "s/^NMEA_TCP_PORT=.*/NMEA_TCP_PORT=\"$NMEA_TCP_PORT\"/" "$INSTALL_DIR/service/run"

echo "5. Creating service symlink..."
if [ -L "$SERVICE_DIR/$SERVICE_LINK" ]; then
    echo "   Service link already exists, removing old link..."
    rm "$SERVICE_DIR/$SERVICE_LINK"
fi
if [ -e "$SERVICE_DIR/$SERVICE_LINK" ]; then
    rm -rf "$SERVICE_DIR/$SERVICE_LINK"
fi
ln -s "$INSTALL_DIR/service" "$SERVICE_DIR/$SERVICE_LINK"

if [ -L "$SERVICE_DIR/$SERVICE_LINK" ]; then
    echo "   Symlink created: $SERVICE_DIR/$SERVICE_LINK -> $INSTALL_DIR/service"
else
    echo "ERROR: Failed to create service symlink"
    exit 1
fi

echo "6. Creating log directory..."
mkdir -p /var/log/dbus-raymarine-publisher

echo "7. Setting up rc.local for persistence..."
RC_LOCAL="/data/rc.local"
if [ ! -f "$RC_LOCAL" ]; then
    echo "#!/bin/bash" > "$RC_LOCAL"
    chmod +x "$RC_LOCAL"
fi

if ! grep -q "dbus-raymarine-publisher" "$RC_LOCAL"; then
    echo "" >> "$RC_LOCAL"
    echo "# Raymarine D-Bus Publisher" >> "$RC_LOCAL"
    echo "if [ ! -L $SERVICE_DIR/$SERVICE_LINK ]; then" >> "$RC_LOCAL"
    echo "    ln -s /data/dbus-raymarine-publisher/service $SERVICE_DIR/$SERVICE_LINK" >> "$RC_LOCAL"
    echo "fi" >> "$RC_LOCAL"
    echo "   Added to rc.local for persistence across firmware updates"
else
    echo "   Already in rc.local"
fi

echo "8. Activating service..."
sleep 2
if command -v svstat >/dev/null 2>&1; then
    if svstat "$SERVICE_DIR/$SERVICE_LINK" 2>/dev/null | grep -q "up"; then
        echo "   Service is running"
    else
        echo "   Waiting for service to start..."
        sleep 3
    fi
fi

echo ""
echo "=================================================="
echo "Installation complete!"
echo "=================================================="
echo ""

if command -v svstat >/dev/null 2>&1; then
    echo "Current status:"
    svstat "$SERVICE_DIR/$SERVICE_LINK" 2>/dev/null || echo "   Service not yet detected by svscan"
    echo ""
fi

echo "Configuration:"
echo "  Interface: $INTERFACE"
if [ "$NMEA_TCP_PORT" != "disabled" ]; then
    echo "  NMEA TCP port: $NMEA_TCP_PORT"
else
    echo "  NMEA TCP: disabled"
fi
echo ""
echo "To check status:"
echo "  svstat $SERVICE_DIR/$SERVICE_LINK"
echo ""
echo "To view logs:"
echo "  tail -F /var/log/dbus-raymarine-publisher/current | tai64nlocal"
echo ""
2
axiom-nmea/examples/dbus-raymarine-publisher/service/log/run
Executable file
@@ -0,0 +1,2 @@
#!/bin/sh
exec multilog t s99999 n8 /var/log/dbus-raymarine-publisher
34
axiom-nmea/examples/dbus-raymarine-publisher/service/run
Executable file
@@ -0,0 +1,34 @@
#!/bin/sh
exec 2>&1

# Configuration - set by install.sh or edit manually
# INTERFACE: network interface name (eth0, wlan0) or a specific IP address
INTERFACE="eth0"
# NMEA TCP server port, or "disabled" to turn off
NMEA_TCP_PORT="10110"

INSTALL_DIR="/data/dbus-raymarine-publisher"

# Resolve interface name to IP address if needed
if echo "$INTERFACE" | grep -qE '^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$'; then
    INTERFACE_IP="$INTERFACE"
else
    INTERFACE_IP=$(ip -4 addr show "$INTERFACE" 2>/dev/null | grep 'inet ' | awk '{print $2}' | cut -d/ -f1 | head -n 1)
    if [ -z "$INTERFACE_IP" ]; then
        echo "Error: Could not get IP address for interface $INTERFACE"
        sleep 10
        exit 1
    fi
fi

cd "$INSTALL_DIR"
export PYTHONPATH="$INSTALL_DIR/ext/velib_python:$PYTHONPATH"

CMD_ARGS="--interface $INTERFACE_IP"
if [ -n "$NMEA_TCP_PORT" ] && [ "$NMEA_TCP_PORT" != "disabled" ]; then
    CMD_ARGS="$CMD_ARGS --nmea-tcp-port $NMEA_TCP_PORT"
else
    CMD_ARGS="$CMD_ARGS --no-nmea-tcp"
fi

exec python3 "$INSTALL_DIR/venus_publisher.py" $CMD_ARGS
27
axiom-nmea/examples/dbus-raymarine-publisher/uninstall.sh
Executable file
@@ -0,0 +1,27 @@
#!/bin/bash
# Uninstall Raymarine D-Bus Publisher for Venus OS

INSTALL_DIR="/data/dbus-raymarine-publisher"
SERVICE_LINK="dbus-raymarine-publisher"

# Find service directory
if [ -d "/service" ] && [ ! -L "/service" ]; then
    SERVICE_DIR="/service"
else
    SERVICE_DIR="/opt/victronenergy/service"
fi

echo "Uninstalling Raymarine D-Bus Publisher..."

# Stop and remove service
if [ -L "$SERVICE_DIR/$SERVICE_LINK" ] || [ -e "$SERVICE_DIR/$SERVICE_LINK" ]; then
    echo "Stopping and removing service..."
    svc -d "$SERVICE_DIR/$SERVICE_LINK" 2>/dev/null || true
    rm -rf "$SERVICE_DIR/$SERVICE_LINK"
fi

echo "Service removed. Config and data in $INSTALL_DIR are preserved."
echo "To remove everything: rm -rf $INSTALL_DIR /var/log/dbus-raymarine-publisher"
425
axiom-nmea/examples/dbus-raymarine-publisher/venus_publisher.py
Normal file
@@ -0,0 +1,425 @@
#!/usr/bin/env python3
"""
Venus OS D-Bus Publisher for Raymarine Sensor Data.

This script reads sensor data from Raymarine LightHouse multicast
and publishes it to Venus OS via D-Bus, making it available to
the Victron ecosystem. It also runs an NMEA TCP server on port 10110
for integration with navigation apps and charting software.

Published D-Bus services:
- com.victronenergy.gps.raymarine_0
    GPS position, speed, and course
- com.victronenergy.meteo.raymarine_0
    Wind direction and speed, air temperature, barometric pressure
- com.victronenergy.tank.raymarine_tank{N}_0
    Tank levels for each configured tank
- com.victronenergy.battery.raymarine_bat{N}_0
    Battery voltage for each configured battery
- com.victronenergy.navigation.raymarine_0
    Heading, depth, and water temperature

NMEA TCP Server (port 10110):
The NMEA TCP server broadcasts ALL available NMEA 0183 sentences,
which includes more data than what Venus OS can display via D-Bus:
- GPS: GGA, GLL, RMC
- Navigation: HDG, HDT, VTG, VHW
- Wind: MWV (apparent & true), MWD
- Depth: DPT, DBT
- Temperature: MTW, MTA
- Transducers: XDR (tanks, batteries, pressure)

Compatible with Navionics, iSailor, OpenCPN, SignalK, and other
NMEA 0183 TCP clients.

Usage:
    # Basic usage (listens on 198.18.5.5 VLAN interface)
    python3 venus_publisher.py

    # Specify interface IP
    python3 venus_publisher.py --interface 198.18.5.5

    # Enable debug logging
    python3 venus_publisher.py --debug

    # Disable specific services
    python3 venus_publisher.py --no-tanks --no-batteries

    # Use a different NMEA TCP port
    python3 venus_publisher.py --nmea-tcp-port 2000

    # Disable NMEA TCP server (D-Bus only)
    python3 venus_publisher.py --no-nmea-tcp

Installation on Venus OS:
    1. Build: ./build-package.sh
    2. Copy to device: scp dbus-raymarine-publisher-*.tar.gz root@<device-ip>:/data/
    3. Extract: cd /data && tar -xzf dbus-raymarine-publisher-*.tar.gz
    4. Install: bash /data/dbus-raymarine-publisher/install.sh

Requirements:
    - Venus OS (or GLib + dbus-python for testing)
    - Network access to Raymarine LightHouse multicast (VLAN interface)
    - raymarine-nmea library

Testing without Venus OS:
    The script will start but D-Bus services won't register without
    velib_python. Use --dry-run to test the listener without D-Bus.

Author: Axiom NMEA Project
License: MIT
"""

import argparse
import logging
import sys
import os
import time

# Add script directory to path (raymarine_nmea is bundled alongside this script
# in deployed packages, or two levels up in the source tree)
_script_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, _script_dir)
sys.path.insert(0, os.path.dirname(os.path.dirname(_script_dir)))

from raymarine_nmea import (
    RaymarineDecoder,
    SensorData,
    MulticastListener,
    NMEATcpServer,
    TANK_CONFIG,
    BATTERY_CONFIG,
)
from raymarine_nmea.venus_dbus import VenusPublisher

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
    handlers=[
        logging.StreamHandler(sys.stdout),
    ],
)
logger = logging.getLogger(__name__)


def parse_args():
    """Parse command line arguments."""
    parser = argparse.ArgumentParser(
        description='Publish Raymarine sensor data to Venus OS D-Bus',
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog=__doc__,
    )

    # Network settings
    parser.add_argument(
        '--interface', '-i',
        default='198.18.5.5',
        help='VLAN interface IP for Raymarine multicast (default: 198.18.5.5)',
    )

    # Service enable/disable
    parser.add_argument(
        '--no-gps',
        action='store_true',
        help='Disable GPS service',
    )
    parser.add_argument(
        '--no-meteo',
        action='store_true',
        help='Disable Meteo (wind) service',
    )
    parser.add_argument(
        '--no-tanks',
        action='store_true',
        help='Disable Tank services',
    )
    parser.add_argument(
        '--no-batteries',
        action='store_true',
        help='Disable Battery services',
    )
    parser.add_argument(
        '--no-navigation',
        action='store_true',
        help='Disable Navigation service (heading, depth, water temp)',
    )

    # Specific IDs
    parser.add_argument(
        '--tank-ids',
        type=str,
        help='Comma-separated tank IDs to publish (default: all configured)',
    )
    parser.add_argument(
        '--battery-ids',
        type=str,
        help='Comma-separated battery IDs to publish (default: all configured)',
    )

    # Update interval
    parser.add_argument(
        '--update-interval',
        type=int,
        default=1000,
        help='D-Bus update interval in milliseconds (default: 1000)',
    )

    # NMEA TCP server settings
    parser.add_argument(
        '--nmea-tcp-port',
        type=int,
        default=10110,
        help='NMEA TCP server port (default: 10110)',
    )
    parser.add_argument(
        '--no-nmea-tcp',
        action='store_true',
        help='Disable NMEA TCP server',
    )

    # Debugging
    parser.add_argument(
        '--debug',
        action='store_true',
        help='Enable debug logging',
    )
    parser.add_argument(
        '--dry-run',
        action='store_true',
        help='Start listener but don\'t register D-Bus services',
    )

    return parser.parse_args()


def main():
    """Main entry point."""
    args = parse_args()

    # Configure logging level
    if args.debug:
        logging.getLogger().setLevel(logging.DEBUG)
        logging.getLogger('raymarine_nmea').setLevel(logging.DEBUG)

    # Parse tank and battery IDs if specified
    tank_ids = None
    if args.tank_ids:
        tank_ids = [int(x.strip()) for x in args.tank_ids.split(',')]

    battery_ids = None
    if args.battery_ids:
        battery_ids = [int(x.strip()) for x in args.battery_ids.split(',')]

    # Log configuration
    logger.info("=" * 60)
    logger.info("Venus OS D-Bus Publisher for Raymarine Sensor Data")
|
||||
logger.info("=" * 60)
|
||||
logger.info(f"Interface IP: {args.interface}")
|
||||
logger.info(f"GPS enabled: {not args.no_gps}")
|
||||
logger.info(f"Meteo enabled: {not args.no_meteo}")
|
||||
logger.info(f"Navigation enabled: {not args.no_navigation}")
|
||||
logger.info(f"Tanks enabled: {not args.no_tanks}")
|
||||
if not args.no_tanks:
|
||||
ids = tank_ids or list(TANK_CONFIG.keys())
|
||||
logger.info(f" Tank IDs: {ids}")
|
||||
logger.info(f"Batteries enabled: {not args.no_batteries}")
|
||||
if not args.no_batteries:
|
||||
ids = battery_ids or list(BATTERY_CONFIG.keys())
|
||||
logger.info(f" Battery IDs: {ids}")
|
||||
logger.info(f"Update interval: {args.update_interval}ms")
|
||||
logger.info(f"NMEA TCP server enabled: {not args.no_nmea_tcp}")
|
||||
if not args.no_nmea_tcp:
|
||||
logger.info(f" NMEA TCP port: {args.nmea_tcp_port}")
|
||||
logger.info("=" * 60)
|
||||
|
||||
# Create components
|
||||
decoder = RaymarineDecoder()
|
||||
sensor_data = SensorData()
|
||||
|
||||
# Callback to log decoded data in debug mode
|
||||
def on_decode(decoded):
|
||||
if args.debug and decoded.has_data():
|
||||
logger.debug(f"Decoded: lat={decoded.latitude}, lon={decoded.longitude}, "
|
||||
f"twd={decoded.twd_deg}, tanks={decoded.tanks}, "
|
||||
f"batteries={decoded.batteries}")
|
||||
|
||||
# Start multicast listener
|
||||
logger.info(f"Starting multicast listener on {args.interface}...")
|
||||
listener = MulticastListener(
|
||||
decoder=decoder,
|
||||
sensor_data=sensor_data,
|
||||
interface_ip=args.interface,
|
||||
on_decode=on_decode if args.debug else None,
|
||||
)
|
||||
listener.start()
|
||||
logger.info("Multicast listener started")
|
||||
|
||||
# Create NMEA TCP server (broadcasts all NMEA sentences, more than D-Bus)
|
||||
nmea_tcp_server = None
|
||||
if not args.no_nmea_tcp:
|
||||
nmea_tcp_server = NMEATcpServer(
|
||||
sensor_data=sensor_data,
|
||||
port=args.nmea_tcp_port,
|
||||
)
|
||||
if nmea_tcp_server.start():
|
||||
logger.info(f"NMEA TCP server started on port {args.nmea_tcp_port}")
|
||||
else:
|
||||
logger.warning("Failed to start NMEA TCP server, continuing without it")
|
||||
nmea_tcp_server = None
|
||||
|
||||
# Dry run mode - just listen and print data
|
||||
if args.dry_run:
|
||||
logger.info("Dry run mode - press Ctrl+C to stop")
|
||||
try:
|
||||
while True:
|
||||
# Broadcast NMEA sentences if TCP server is running
|
||||
if nmea_tcp_server:
|
||||
nmea_tcp_server.broadcast()
|
||||
|
||||
time.sleep(args.update_interval / 1000.0)
|
||||
data = sensor_data.to_dict()
|
||||
tcp_clients = nmea_tcp_server.client_count if nmea_tcp_server else 0
|
||||
logger.info(f"Position: {data['position']}")
|
||||
logger.info(f"Wind: {data['wind']}")
|
||||
logger.info(f"Tanks: {data['tanks']}")
|
||||
logger.info(f"Batteries: {data['batteries']}")
|
||||
logger.info(f"NMEA TCP clients: {tcp_clients}")
|
||||
logger.info("-" * 40)
|
||||
except KeyboardInterrupt:
|
||||
logger.info("Stopping...")
|
||||
if nmea_tcp_server:
|
||||
nmea_tcp_server.stop()
|
||||
listener.stop()
|
||||
return
|
||||
|
||||
# Create Venus publisher
|
||||
logger.info("Creating Venus OS D-Bus publisher...")
|
||||
publisher = VenusPublisher(
|
||||
sensor_data=sensor_data,
|
||||
enable_gps=not args.no_gps,
|
||||
enable_meteo=not args.no_meteo,
|
||||
enable_navigation=not args.no_navigation,
|
||||
enable_tanks=not args.no_tanks,
|
||||
enable_batteries=not args.no_batteries,
|
||||
tank_ids=tank_ids,
|
||||
battery_ids=battery_ids,
|
||||
update_interval_ms=args.update_interval,
|
||||
)
|
||||
|
||||
# Log service status
|
||||
for service in publisher.services:
|
||||
logger.info(f" Service: {service.service_name}")
|
||||
|
||||
# Run publisher with integrated NMEA TCP broadcasting
|
||||
try:
|
||||
logger.info("Starting D-Bus publisher...")
|
||||
_run_with_nmea_tcp(publisher, nmea_tcp_server, args.update_interval)
|
||||
except RuntimeError as e:
|
||||
logger.error(f"Failed to start publisher: {e}")
|
||||
logger.info("Falling back to NMEA TCP only mode (no D-Bus)")
|
||||
|
||||
# Fall back to just NMEA TCP broadcasting
|
||||
try:
|
||||
while True:
|
||||
if nmea_tcp_server:
|
||||
nmea_tcp_server.broadcast()
|
||||
time.sleep(args.update_interval / 1000.0)
|
||||
data = sensor_data.to_dict()
|
||||
tcp_clients = nmea_tcp_server.client_count if nmea_tcp_server else 0
|
||||
logger.info(f"GPS: {data['position']}")
|
||||
logger.info(f"Wind: {data['wind']}")
|
||||
logger.info(f"Stats: {data['stats']}")
|
||||
logger.info(f"NMEA TCP clients: {tcp_clients}")
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
finally:
|
||||
logger.info("Stopping services...")
|
||||
if nmea_tcp_server:
|
||||
nmea_tcp_server.stop()
|
||||
listener.stop()
|
||||
logger.info("Shutdown complete")
|
||||
|
||||
|
||||
def _run_with_nmea_tcp(publisher: VenusPublisher, nmea_tcp_server, update_interval_ms: int):
|
||||
"""Run the Venus publisher with integrated NMEA TCP broadcasting.
|
||||
|
||||
This combines D-Bus publishing with NMEA TCP broadcasting in the same
|
||||
GLib main loop, ensuring both are updated at the same interval.
|
||||
|
||||
Args:
|
||||
publisher: VenusPublisher instance
|
||||
nmea_tcp_server: NMEATcpServer instance (or None to skip)
|
||||
update_interval_ms: Update interval in milliseconds
|
||||
"""
|
||||
import signal
|
||||
|
||||
# Try to import GLib
|
||||
try:
|
||||
from gi.repository import GLib
|
||||
except ImportError:
|
||||
raise RuntimeError(
|
||||
"GLib is required to run VenusPublisher. "
|
||||
"Install PyGObject or use --dry-run mode."
|
||||
)
|
||||
|
||||
# Set up D-Bus main loop
|
||||
try:
|
||||
from dbus.mainloop.glib import DBusGMainLoop
|
||||
DBusGMainLoop(set_as_default=True)
|
||||
except ImportError:
|
||||
raise RuntimeError(
|
||||
"dbus-python with GLib support is required. "
|
||||
"Install python3-dbus on Venus OS."
|
||||
)
|
||||
|
||||
# Start D-Bus services
|
||||
if not publisher.start():
|
||||
logger.error("Failed to start VenusPublisher")
|
||||
return
|
||||
|
||||
# Main loop reference for signal handler
|
||||
mainloop = None
|
||||
|
||||
# Set up signal handlers for graceful shutdown
|
||||
def signal_handler(signum, frame):
|
||||
logger.info(f"Received signal {signum}, stopping...")
|
||||
if mainloop:
|
||||
mainloop.quit()
|
||||
|
||||
signal.signal(signal.SIGINT, signal_handler)
|
||||
signal.signal(signal.SIGTERM, signal_handler)
|
||||
|
||||
# Combined update callback: D-Bus + NMEA TCP
|
||||
def update_callback():
|
||||
# Update D-Bus services
|
||||
if not publisher.update():
|
||||
return False
|
||||
|
||||
# Broadcast NMEA sentences to TCP clients
|
||||
if nmea_tcp_server:
|
||||
nmea_tcp_server.broadcast()
|
||||
|
||||
return True
|
||||
|
||||
# Set up periodic updates
|
||||
GLib.timeout_add(update_interval_ms, update_callback)
|
||||
|
||||
# Run main loop
|
||||
logger.info("Publisher running, press Ctrl+C to stop")
|
||||
mainloop = GLib.MainLoop()
|
||||
|
||||
try:
|
||||
mainloop.run()
|
||||
except Exception as e:
|
||||
logger.error(f"Main loop error: {e}")
|
||||
finally:
|
||||
publisher.stop()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
133
axiom-nmea/examples/pcap-to-nmea/pcap_to_nmea.py
Normal file
@@ -0,0 +1,133 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
PCAP to NMEA Conversion Example
|
||||
|
||||
This example demonstrates how to use the raymarine_nmea library to
|
||||
read Raymarine packets from a PCAP file and generate NMEA sentences.
|
||||
|
||||
Useful for:
|
||||
- Testing without a live Raymarine network
|
||||
- Analyzing captured data
|
||||
- Generating NMEA data for replay
|
||||
|
||||
Usage:
|
||||
python pcap_to_nmea.py capture.pcap
|
||||
|
||||
# Output JSON instead of NMEA
|
||||
python pcap_to_nmea.py capture.pcap --json
|
||||
|
||||
# Output specific sentence types only
|
||||
python pcap_to_nmea.py capture.pcap --sentences gga,mwd,mtw
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
# Add repo root to path for development
|
||||
sys.path.insert(0, str(Path(__file__).resolve().parent.parent.parent))
|
||||
|
||||
from raymarine_nmea import (
|
||||
RaymarineDecoder,
|
||||
SensorData,
|
||||
NMEAGenerator,
|
||||
)
|
||||
from raymarine_nmea.listeners import PcapReader
|
||||
|
||||
|
||||
def main():
|
||||
parser = argparse.ArgumentParser(
|
||||
description="Convert Raymarine PCAP to NMEA sentences"
|
||||
)
|
||||
parser.add_argument(
|
||||
'pcap',
|
||||
help='Path to PCAP file'
|
||||
)
|
||||
parser.add_argument(
|
||||
'--json',
|
||||
action='store_true',
|
||||
help='Output JSON instead of NMEA'
|
||||
)
|
||||
parser.add_argument(
|
||||
'--sentences',
|
||||
help='Comma-separated list of sentence types to output'
|
||||
)
|
||||
parser.add_argument(
|
||||
'-v', '--verbose',
|
||||
action='store_true',
|
||||
help='Verbose output (show decode progress)'
|
||||
)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
# Check file exists
|
||||
if not os.path.exists(args.pcap):
|
||||
print(f"Error: File not found: {args.pcap}")
|
||||
sys.exit(1)
|
||||
|
||||
# Read PCAP
|
||||
print(f"Reading {args.pcap}...", file=sys.stderr)
|
||||
reader = PcapReader(args.pcap)
|
||||
print(f"Found {len(reader)} packets", file=sys.stderr)
|
||||
|
||||
# Decode all packets
|
||||
decoder = RaymarineDecoder(verbose=args.verbose)
|
||||
data = SensorData()
|
||||
|
||||
decoded_count = 0
|
||||
for packet in reader:
|
||||
result = decoder.decode(packet)
|
||||
if result.has_data():
|
||||
decoded_count += 1
|
||||
data.update(result)
|
||||
|
||||
print(f"Decoded {decoded_count} packets with data", file=sys.stderr)
|
||||
|
||||
# Output format
|
||||
if args.json:
|
||||
# JSON output
|
||||
print(json.dumps(data.to_dict(), indent=2))
|
||||
else:
|
||||
# NMEA output
|
||||
generator = NMEAGenerator()
|
||||
|
||||
# Filter sentences if specified
|
||||
if args.sentences:
|
||||
enabled = set(s.strip().lower() for s in args.sentences.split(','))
|
||||
generator.enabled = enabled
|
||||
|
||||
sentences = generator.generate_all(data)
|
||||
|
||||
if sentences:
|
||||
print("\n# Generated NMEA sentences:", file=sys.stderr)
|
||||
for sentence in sentences:
|
||||
print(sentence, end='')
|
||||
else:
|
||||
print("No NMEA sentences generated (no valid data)", file=sys.stderr)
|
||||
|
||||
# Summary
|
||||
print("\n# Summary:", file=sys.stderr)
|
||||
print(f"# Packets: {data.packet_count}", file=sys.stderr)
|
||||
print(f"# Decoded: {data.decode_count}", file=sys.stderr)
|
||||
|
||||
with data._lock:
|
||||
if data.latitude:
|
||||
print(f"# GPS: {data.latitude:.6f}, {data.longitude:.6f}",
|
||||
file=sys.stderr)
|
||||
if data.heading_deg:
|
||||
print(f"# Heading: {data.heading_deg:.1f}°", file=sys.stderr)
|
||||
if data.twd_deg:
|
||||
print(f"# Wind: {data.tws_kts:.1f} kts @ {data.twd_deg:.1f}°",
|
||||
file=sys.stderr)
|
||||
if data.depth_m:
|
||||
print(f"# Depth: {data.depth_m:.1f} m", file=sys.stderr)
|
||||
if data.tanks:
|
||||
print(f"# Tanks: {len(data.tanks)}", file=sys.stderr)
|
||||
if data.batteries:
|
||||
print(f"# Batteries: {len(data.batteries)}", file=sys.stderr)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
79
axiom-nmea/examples/quickstart/quickstart.py
Normal file
@@ -0,0 +1,79 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Raymarine NMEA Library - Quick Start Example
|
||||
|
||||
This is a minimal example showing how to use the library.
|
||||
"""
|
||||
|
||||
import sys
|
||||
import time
|
||||
from pathlib import Path
|
||||
|
||||
# Add repo root to path for development
|
||||
sys.path.insert(0, str(Path(__file__).resolve().parent.parent.parent))
|
||||
|
||||
from raymarine_nmea import (
|
||||
RaymarineDecoder,
|
||||
SensorData,
|
||||
NMEAGenerator,
|
||||
MulticastListener,
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
# Check for interface IP argument
|
||||
if len(sys.argv) < 2:
|
||||
print("Usage: python quickstart.py <interface_ip>")
|
||||
print("Example: python quickstart.py 198.18.5.5")
|
||||
sys.exit(1)
|
||||
|
||||
interface_ip = sys.argv[1]
|
||||
|
||||
# Create components
|
||||
decoder = RaymarineDecoder()
|
||||
data = SensorData()
|
||||
generator = NMEAGenerator()
|
||||
|
||||
# Start listening
|
||||
listener = MulticastListener(
|
||||
decoder=decoder,
|
||||
sensor_data=data,
|
||||
interface_ip=interface_ip,
|
||||
)
|
||||
listener.start()
|
||||
|
||||
print(f"Listening on {interface_ip}...")
|
||||
print("Press Ctrl+C to stop\n")
|
||||
|
||||
try:
|
||||
while True:
|
||||
# Generate all available NMEA sentences
|
||||
sentences = generator.generate_all(data)
|
||||
|
||||
# Print each sentence
|
||||
for sentence in sentences:
|
||||
print(sentence, end='')
|
||||
|
||||
if sentences:
|
||||
print() # Blank line between updates
|
||||
|
||||
time.sleep(1)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
print("\nStopping...")
|
||||
finally:
|
||||
listener.stop()
|
||||
|
||||
# Print final summary
|
||||
print("\nFinal sensor data:")
|
||||
print(f" GPS: {data.latitude}, {data.longitude}")
|
||||
print(f" Heading: {data.heading_deg}")
|
||||
print(f" Wind: {data.tws_kts} kts @ {data.twd_deg}°")
|
||||
print(f" Depth: {data.depth_m} m")
|
||||
print(f" Water temp: {data.water_temp_c}°C")
|
||||
print(f" Tanks: {data.tanks}")
|
||||
print(f" Batteries: {data.batteries}")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
184
axiom-nmea/examples/sensor-monitor/sensor_monitor.py
Normal file
@@ -0,0 +1,184 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Raymarine Sensor Update Monitor
|
||||
|
||||
Displays real-time frequency of updates from each sensor type on the Raymarine network.
|
||||
Useful for diagnosing gaps or inconsistent data delivery.
|
||||
|
||||
Usage:
|
||||
python sensor_monitor.py -i 198.18.5.5
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import logging
|
||||
import sys
|
||||
import time
|
||||
from datetime import datetime
|
||||
from pathlib import Path
|
||||
|
||||
# Add repo root to path for development
|
||||
sys.path.insert(0, str(Path(__file__).resolve().parent.parent.parent))
|
||||
|
||||
from raymarine_nmea import RaymarineDecoder, MulticastListener, SensorData
|
||||
|
||||
|
||||
def main():
|
||||
parser = argparse.ArgumentParser(description="Monitor Raymarine sensor update frequency")
|
||||
parser.add_argument('-i', '--interface', required=True,
|
||||
help='Interface IP for Raymarine multicast (e.g., 198.18.5.5)')
|
||||
parser.add_argument('--interval', type=float, default=1.0,
|
||||
help='Display refresh interval in seconds (default: 1.0)')
|
||||
parser.add_argument('--debug', action='store_true',
|
||||
help='Enable debug logging to see raw packets')
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
# Enable debug logging if requested
|
||||
if args.debug:
|
||||
logging.basicConfig(
|
||||
level=logging.DEBUG,
|
||||
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
|
||||
)
|
||||
else:
|
||||
logging.basicConfig(level=logging.WARNING)
|
||||
|
||||
sensor_data = SensorData()
|
||||
decoder = RaymarineDecoder()
|
||||
|
||||
listener = MulticastListener(
|
||||
decoder=decoder,
|
||||
sensor_data=sensor_data,
|
||||
interface_ip=args.interface,
|
||||
)
|
||||
|
||||
print(f"Starting Raymarine sensor monitor on interface {args.interface}...")
|
||||
if args.debug:
|
||||
print("Debug mode enabled - check logs for packet details")
|
||||
listener.start()
|
||||
|
||||
# Track update times for each sensor
|
||||
last_values = {}
|
||||
update_counts = {}
|
||||
last_update_time = {}
|
||||
max_gaps = {}
|
||||
start_time = time.time()
|
||||
|
||||
sensor_fields = [
|
||||
('GPS', 'gps', lambda d: (d.latitude, d.longitude)),
|
||||
('Heading', 'heading', lambda d: d.heading_deg),
|
||||
('COG', 'heading', lambda d: d.cog_deg),
|
||||
('SOG', 'heading', lambda d: d.sog_kts),
|
||||
('Depth', 'depth', lambda d: d.depth_m),
|
||||
('Water Temp', 'temp', lambda d: d.water_temp_c),
|
||||
('Air Temp', 'temp', lambda d: d.air_temp_c),
|
||||
('Wind (Apparent)', 'wind', lambda d: (d.awa_deg, d.aws_kts)),
|
||||
('Wind (True)', 'wind', lambda d: (d.twd_deg, d.tws_kts)),
|
||||
('Pressure', 'pressure', lambda d: d.pressure_mbar),
|
||||
('Tanks', 'tank', lambda d: dict(d.tanks) if d.tanks else None),
|
||||
('Batteries', 'battery', lambda d: dict(d.batteries) if d.batteries else None),
|
||||
]
|
||||
|
||||
for name, _, _ in sensor_fields:
|
||||
update_counts[name] = 0
|
||||
last_update_time[name] = None
|
||||
max_gaps[name] = 0
|
||||
last_values[name] = None
|
||||
|
||||
try:
|
||||
while True:
|
||||
if not args.debug:
|
||||
# Clear screen only in non-debug mode
|
||||
print("\033[2J\033[H", end="")
|
||||
|
||||
print("=" * 80)
|
||||
print(f"RAYMARINE SENSOR UPDATE MONITOR - {datetime.now().strftime('%H:%M:%S')}")
|
||||
print("=" * 80)
|
||||
|
||||
elapsed = time.time() - start_time
|
||||
print(f"Monitoring for: {elapsed:.1f}s")
|
||||
print(f"Packets: {sensor_data.packet_count} | Decoded: {sensor_data.decode_count}")
|
||||
print()
|
||||
|
||||
print(f"{'Sensor':<18} {'Value':<25} {'Age':>8} {'Count':>7} {'Avg':>8} {'MaxGap':>8}")
|
||||
print("-" * 80)
|
||||
|
||||
# Get age mapping (must be outside lock since get_age() also locks)
|
||||
age_values = {}
|
||||
for age_type in ['gps', 'heading', 'depth', 'temp', 'wind', 'pressure', 'tank', 'battery']:
|
||||
age_values[age_type] = sensor_data.get_age(age_type)
|
||||
|
||||
with sensor_data._lock:
|
||||
for name, age_type, getter in sensor_fields:
|
||||
try:
|
||||
value = getter(sensor_data)
|
||||
except Exception:
|
||||
value = None
|
||||
|
||||
# Check if value changed
|
||||
if value != last_values[name] and value is not None:
|
||||
now = time.time()
|
||||
if last_update_time[name] is not None:
|
||||
gap = now - last_update_time[name]
|
||||
if gap > max_gaps[name]:
|
||||
max_gaps[name] = gap
|
||||
last_update_time[name] = now
|
||||
update_counts[name] += 1
|
||||
last_values[name] = value
|
||||
|
||||
# Get age (from pre-fetched values)
|
||||
age = age_values.get(age_type)
|
||||
|
||||
# Format value for display
|
||||
if value is None:
|
||||
val_str = "-"
|
||||
elif isinstance(value, tuple):
|
||||
parts = [f"{v:.1f}" if v is not None else "-" for v in value]
|
||||
val_str = ", ".join(parts)
|
||||
elif isinstance(value, dict):
|
||||
val_str = f"{len(value)} items"
|
||||
elif isinstance(value, float):
|
||||
val_str = f"{value:.2f}"
|
||||
else:
|
||||
val_str = str(value)[:25]
|
||||
|
||||
# Truncate value string
|
||||
if len(val_str) > 25:
|
||||
val_str = val_str[:22] + "..."
|
||||
|
||||
# Color based on age
|
||||
if age is None:
|
||||
color = "\033[90m" # Gray
|
||||
age_str = "-"
|
||||
elif age > 5:
|
||||
color = "\033[91m" # Red
|
||||
age_str = f"{age:.1f}s"
|
||||
elif age > 2:
|
||||
color = "\033[93m" # Yellow
|
||||
age_str = f"{age:.1f}s"
|
||||
else:
|
||||
color = "\033[92m" # Green
|
||||
age_str = f"{age:.1f}s"
|
||||
|
||||
reset = "\033[0m"
|
||||
|
||||
count = update_counts[name]
|
||||
avg_str = f"{count/elapsed:.2f}/s" if elapsed > 0 and count > 0 else "-"
|
||||
max_gap_str = f"{max_gaps[name]:.1f}s" if max_gaps[name] > 0 else "-"
|
||||
|
||||
print(f"{color}{name:<18} {val_str:<25} {age_str:>8} {count:>7} {avg_str:>8} {max_gap_str:>8}{reset}")
|
||||
|
||||
print()
|
||||
print("=" * 80)
|
||||
print("Press Ctrl+C to exit")
|
||||
if args.debug:
|
||||
print("\n--- Debug output follows ---\n")
|
||||
|
||||
time.sleep(args.interval)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
print("\nStopping monitor...")
|
||||
listener.stop()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
21
axiom-nmea/nmea-server/.env.example
Normal file
@@ -0,0 +1,21 @@
|
||||
# NMEA Server Configuration
|
||||
# Copy this file to .env and modify as needed
|
||||
|
||||
# Required: IP address of the interface connected to Raymarine network
|
||||
# This is where multicast data is received from
|
||||
# Find your interface IP with: ip addr show | grep "inet "
|
||||
RAYMARINE_INTERFACE=198.18.5.5
|
||||
|
||||
# IP address to bind NMEA server (default: 0.0.0.0 = all interfaces)
|
||||
# Use a specific IP to expose NMEA on a different interface than Raymarine
|
||||
# Example: Receive from Raymarine on 198.18.5.5, serve NMEA on 198.18.10.62
|
||||
NMEA_HOST=0.0.0.0
|
||||
|
||||
# NMEA TCP server port (default: 10110)
|
||||
NMEA_PORT=10110
|
||||
|
||||
# Update interval in seconds (default: 1.0)
|
||||
UPDATE_INTERVAL=1.0
|
||||
|
||||
# Logging level: DEBUG, INFO, WARNING, ERROR
|
||||
LOG_LEVEL=INFO
|
||||
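The server reads these variables at startup, treating unset or empty values as defaults. A minimal sketch of that pattern (the helper name `env_int` is illustrative, not part of the repo):

```python
import os
from typing import Optional


def env_int(name: str, default: Optional[int]) -> Optional[int]:
    """Read an integer env var, treating unset or empty values as the default."""
    raw = os.environ.get(name, "")
    return int(raw) if raw else default


# With nothing set, the documented default applies.
assert env_int("NMEA_PORT", 10110) == 10110

os.environ["NMEA_PORT"] = "2000"
assert env_int("NMEA_PORT", 10110) == 2000

# An empty value (e.g. a variable passed through docker-compose as "") counts
# as "not set" instead of crashing on int("").
os.environ["NMEA_UDP_PORT"] = ""
assert env_int("NMEA_UDP_PORT", None) is None
```

The empty-string case matters because docker-compose forwards optional variables as `""` rather than leaving them undefined.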
36
axiom-nmea/nmea-server/Dockerfile
Normal file
@@ -0,0 +1,36 @@
|
||||
FROM python:3.11-slim
|
||||
|
||||
LABEL maintainer="Axiom NMEA Project"
|
||||
LABEL description="NMEA TCP Server for Raymarine LightHouse protocol"
|
||||
|
||||
# Set working directory
|
||||
WORKDIR /app
|
||||
|
||||
# Copy the package files for installation
|
||||
COPY pyproject.toml /app/
|
||||
COPY raymarine_nmea/ /app/raymarine_nmea/
|
||||
|
||||
# Install the package
|
||||
RUN pip install --no-cache-dir -e .
|
||||
|
||||
# Copy the server
|
||||
COPY nmea-server/server.py /app/
|
||||
|
||||
# Create non-root user for security
|
||||
RUN useradd --create-home --shell /bin/bash nmea && \
|
||||
chown -R nmea:nmea /app
|
||||
|
||||
USER nmea
|
||||
|
||||
# Default environment variables
|
||||
ENV RAYMARINE_INTERFACE=""
|
||||
ENV NMEA_HOST=0.0.0.0
|
||||
ENV NMEA_PORT=10110
|
||||
ENV UPDATE_INTERVAL=1.0
|
||||
ENV LOG_LEVEL=INFO
|
||||
|
||||
# Expose the NMEA port
|
||||
EXPOSE 10110
|
||||
|
||||
# Run the server
|
||||
ENTRYPOINT ["python", "-u", "/app/server.py"]
|
||||
52
axiom-nmea/nmea-server/docker-compose.yml
Normal file
@@ -0,0 +1,52 @@
|
||||
services:
|
||||
nmea-server:
|
||||
build:
|
||||
context: ..
|
||||
dockerfile: nmea-server/Dockerfile
|
||||
container_name: nmea-server
|
||||
restart: unless-stopped
|
||||
|
||||
# Host network mode required for multicast reception
|
||||
network_mode: host
|
||||
|
||||
# Network capabilities required for proper socket handling
|
||||
cap_add:
|
||||
- NET_ADMIN
|
||||
- NET_RAW
|
||||
|
||||
environment:
|
||||
# Required: IP address of the interface connected to Raymarine network
|
||||
# This is where multicast data is received from
|
||||
# Find with: ip addr show | grep "inet "
|
||||
RAYMARINE_INTERFACE: ${RAYMARINE_INTERFACE:?Set RAYMARINE_INTERFACE to your network interface IP}
|
||||
|
||||
# IP address to bind NMEA server (default: 0.0.0.0 = all interfaces)
|
||||
# Use a specific IP to expose NMEA on a different interface than Raymarine
|
||||
NMEA_HOST: ${NMEA_HOST:-0.0.0.0}
|
||||
|
||||
# NMEA TCP server port (default: 10110)
|
||||
NMEA_PORT: ${NMEA_PORT:-10110}
|
||||
|
||||
# NMEA UDP broadcast port (optional - set to enable UDP)
|
||||
NMEA_UDP_PORT: ${NMEA_UDP_PORT:-}
|
||||
|
||||
# NMEA UDP destination IP (default: 255.255.255.255 broadcast)
|
||||
NMEA_UDP_DEST: ${NMEA_UDP_DEST:-255.255.255.255}
|
||||
|
||||
# Update interval in seconds (default: 1.0)
|
||||
UPDATE_INTERVAL: ${UPDATE_INTERVAL:-1.0}
|
||||
|
||||
# Logging level: DEBUG, INFO, WARNING, ERROR
|
||||
LOG_LEVEL: ${LOG_LEVEL:-INFO}
|
||||
|
||||
# Note: ports mapping not used with host network mode
|
||||
# The server listens on NMEA_PORT (default 10110) directly on host
|
||||
|
||||
logging:
|
||||
driver: json-file
|
||||
options:
|
||||
max-size: "10m"
|
||||
max-file: "3"
|
||||
|
||||
# Ensure clean shutdown
|
||||
stop_grace_period: 10s
|
||||
249
axiom-nmea/nmea-server/server.py
Normal file
@@ -0,0 +1,249 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
NMEA Data Server - Docker Daemon
|
||||
|
||||
Simple TCP/UDP server that broadcasts NMEA sentences from Raymarine data.
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import logging
|
||||
import os
|
||||
import signal
|
||||
import socket
|
||||
import sys
|
||||
import threading
|
||||
import time
|
||||
from pathlib import Path
|
||||
from typing import Dict, Optional
|
||||
|
||||
# Add repo root to path for development (Docker installs via pip)
|
||||
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))
|
||||
|
||||
from raymarine_nmea import (
|
||||
RaymarineDecoder,
|
||||
SensorData,
|
||||
MulticastListener,
|
||||
NMEAGenerator,
|
||||
)
|
||||
|
||||
logging.basicConfig(
|
||||
level=logging.INFO,
|
||||
format='%(asctime)s - %(levelname)s - %(message)s',
|
||||
datefmt='%Y-%m-%d %H:%M:%S'
|
||||
)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class NMEAServer:
|
||||
"""TCP/UDP server for NMEA broadcast."""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
host: str = '0.0.0.0',
|
||||
tcp_port: int = 10110,
|
||||
udp_port: Optional[int] = None,
|
||||
udp_dest: str = '255.255.255.255',
|
||||
interval: float = 1.0
|
||||
):
|
||||
self.host = host
|
||||
self.tcp_port = tcp_port
|
||||
self.udp_port = udp_port
|
||||
self.udp_dest = udp_dest
|
||||
self.interval = interval
|
||||
|
||||
self.sensor_data = SensorData()
|
||||
# Enable all available NMEA sentences
|
||||
self.generator = NMEAGenerator(
|
||||
enabled_sentences={
|
||||
# GPS
|
||||
'gga', 'gll', 'rmc',
|
||||
# Navigation
|
||||
'hdg', 'hdt', 'vtg', 'vhw',
|
||||
# Wind
|
||||
'mwv_apparent', 'mwv_true', 'mwd',
|
||||
# Depth
|
||||
'dpt', 'dbt',
|
||||
# Temperature
|
||||
'mtw', 'mta',
|
||||
# Transducers (tanks, batteries, pressure)
|
||||
'xdr_tanks', 'xdr_batteries', 'xdr_pressure',
|
||||
}
|
||||
)
|
||||
|
||||
self.server_socket: Optional[socket.socket] = None
|
||||
self.udp_socket: Optional[socket.socket] = None
|
||||
self.clients: Dict[socket.socket, str] = {}
|
||||
self.clients_lock = threading.Lock()
|
||||
self.running = False
|
||||
|
||||
def accept_loop(self):
|
||||
"""Accept new TCP client connections."""
|
||||
while self.running:
|
||||
try:
|
||||
client, addr = self.server_socket.accept()
|
||||
addr_str = f"{addr[0]}:{addr[1]}"
|
||||
client.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
|
||||
|
||||
with self.clients_lock:
|
||||
self.clients[client] = addr_str
|
||||
count = len(self.clients)
|
||||
|
||||
logger.info(f"TCP client connected: {addr_str} (total: {count})")
|
||||
|
||||
except socket.timeout:
|
||||
continue
|
||||
except OSError as e:
|
||||
if self.running:
|
||||
logger.debug(f"Accept error: {e}")
|
||||
|
||||
def broadcast(self, data: bytes):
|
||||
"""Send data to all TCP clients and UDP."""
|
||||
# TCP broadcast
|
||||
with self.clients_lock:
|
||||
dead = []
|
||||
for client, addr in list(self.clients.items()):
|
||||
try:
|
||||
client.sendall(data)
|
||||
except Exception as e:
|
||||
logger.info(f"TCP client {addr} error: {e}")
|
||||
dead.append(client)
|
||||
|
||||
for client in dead:
|
||||
addr = self.clients.pop(client, "unknown")
|
||||
try:
|
||||
client.close()
|
||||
except Exception:
|
||||
pass
|
||||
logger.info(f"TCP client {addr} removed (total: {len(self.clients)})")
|
||||
|
||||
# UDP broadcast
|
||||
if self.udp_socket and self.udp_port:
|
||||
try:
|
||||
self.udp_socket.sendto(data, (self.udp_dest, self.udp_port))
|
||||
except Exception as e:
|
||||
logger.debug(f"UDP send error: {e}")
|
||||
|
||||
def run(self, interface_ip: str):
|
||||
"""Run the server."""
|
||||
self.running = True
|
||||
|
||||
# Start Raymarine listener
|
||||
decoder = RaymarineDecoder()
|
||||
listener = MulticastListener(
|
||||
decoder=decoder,
|
||||
sensor_data=self.sensor_data,
|
||||
interface_ip=interface_ip,
|
||||
)
|
||||
listener.start()
|
||||
|
||||
# Create TCP server socket
|
||||
self.server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
self.server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
|
||||
self.server_socket.settimeout(1.0)
|
||||
self.server_socket.bind((self.host, self.tcp_port))
|
||||
self.server_socket.listen(5)
|
||||
|
||||
# Create UDP socket if enabled
|
||||
if self.udp_port:
|
||||
self.udp_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
|
||||
self.udp_socket.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
|
||||
logger.info(f"UDP broadcast enabled: {self.udp_dest}:{self.udp_port}")
|
||||
|
||||
# Start TCP accept thread
|
||||
accept_thread = threading.Thread(target=self.accept_loop, daemon=True)
|
||||
accept_thread.start()
|
||||
|
||||
logger.info(f"TCP server on {self.host}:{self.tcp_port}")
|
||||
logger.info(f"Raymarine interface: {interface_ip}")
|
||||
|
||||
last_log = 0
|
||||
|
||||
try:
|
||||
while self.running:
|
||||
sentences = self.generator.generate_all(self.sensor_data)
|
||||
if sentences:
|
||||
data = ''.join(sentences).encode('ascii')
|
||||
self.broadcast(data)
|
||||
for s in sentences:
|
||||
logger.info(f"TX: {s.strip()}")
|
||||
else:
|
||||
logger.warning("No NMEA data this cycle")
|
||||
|
||||
now = time.time()
|
||||
if now - last_log >= 30:
|
||||
with self.sensor_data._lock:
|
||||
lat = self.sensor_data.latitude
|
||||
lon = self.sensor_data.longitude
|
||||
if lat and lon:
|
||||
logger.info(f"GPS: {lat:.6f}, {lon:.6f} | TCP clients: {len(self.clients)}")
|
||||
last_log = now
|
||||
|
||||
time.sleep(self.interval)
|
||||
|
||||
finally:
|
||||
self.running = False
|
||||
listener.stop()
|
||||
self.server_socket.close()
|
||||
if self.udp_socket:
|
||||
self.udp_socket.close()
|
||||
with self.clients_lock:
|
||||
for c in self.clients:
|
||||
try:
|
||||
c.close()
|
||||
except Exception:
|
||||
pass
|
||||
logger.info("Server stopped")
|
||||
|
||||
|
||||
_server: Optional[NMEAServer] = None
|
||||
|
||||
|
||||
def signal_handler(signum, frame):
|
||||
if _server:
|
||||
_server.running = False
|
||||
|
||||
|
||||
def main():
|
||||
global _server
|
||||
|
||||
parser = argparse.ArgumentParser(description="NMEA server")
|
||||
parser.add_argument('-i', '--interface', default=os.environ.get('RAYMARINE_INTERFACE'))
|
||||
parser.add_argument('-H', '--host', default=os.environ.get('NMEA_HOST', '0.0.0.0'))
|
||||
parser.add_argument('-p', '--port', type=int, default=int(os.environ.get('NMEA_PORT', '10110')),
|
||||
help='TCP port (default: 10110)')
|
||||
parser.add_argument('--udp-port', type=int, default=os.environ.get('NMEA_UDP_PORT'),
|
||||
help='UDP port for broadcast (optional)')
|
||||
parser.add_argument('--udp-dest', default=os.environ.get('NMEA_UDP_DEST', '255.255.255.255'),
|
||||
help='UDP destination IP (default: 255.255.255.255 broadcast)')
|
||||
parser.add_argument('--interval', type=float, default=float(os.environ.get('UPDATE_INTERVAL', '1.0')))
|
||||
parser.add_argument('--log-level', default=os.environ.get('LOG_LEVEL', 'INFO'))
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
if not args.interface:
|
||||
parser.error("Interface IP required (-i or RAYMARINE_INTERFACE)")
|
||||
|
||||
logging.getLogger().setLevel(getattr(logging, args.log_level))
|
||||
|
||||
signal.signal(signal.SIGTERM, signal_handler)
|
||||
signal.signal(signal.SIGINT, signal_handler)
|
||||
|
||||
_server = NMEAServer(
|
||||
host=args.host,
|
||||
tcp_port=args.port,
|
||||
udp_port=args.udp_port,
|
||||
udp_dest=args.udp_dest,
|
||||
interval=args.interval
|
||||
)
|
||||
|
||||
try:
|
||||
_server.run(args.interface)
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
except Exception as e:
|
||||
logger.error(f"Fatal: {e}")
|
||||
sys.exit(1)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
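The server and listener sockets above all carry a one-second timeout so blocking calls wake up periodically and re-check the `running` flag that the signal handler clears. A minimal self-contained sketch of that shutdown pattern (a throwaway localhost socket, not the actual server class):

```python
import socket
import threading
import time

# A 1-second accept timeout turns a blocking accept() into a polling
# loop: each timeout is a chance to notice that `running` went False.
running = True
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.settimeout(1.0)
srv.bind(("127.0.0.1", 0))   # ephemeral port, demo only
srv.listen(5)

def accept_loop():
    while running:
        try:
            conn, _addr = srv.accept()
            conn.close()
        except socket.timeout:
            continue         # re-check the flag and block again

t = threading.Thread(target=accept_loop, daemon=True)
t.start()
time.sleep(0.1)
running = False              # what signal_handler effectively does
t.join(timeout=3.0)
srv.close()
print(t.is_alive())          # False: the loop exited cleanly
```

Without the timeout, `accept()` would block indefinitely and SIGTERM handling would only take effect on the next incoming connection.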
38 axiom-nmea/pyproject.toml Normal file
@@ -0,0 +1,38 @@
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "raymarine-nmea"
version = "1.0.0"
description = "Decode Raymarine LightHouse protocol data and convert to NMEA 0183 sentences"
readme = "README.md"
requires-python = ">=3.8"
license = {text = "MIT"}
authors = [
    {name = "Axiom NMEA Project"}
]
keywords = ["raymarine", "nmea", "marine", "navigation", "gps", "sailing"]
classifiers = [
    "Development Status :: 4 - Beta",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Topic :: Scientific/Engineering :: GIS",
]

# No external dependencies - uses only Python standard library
dependencies = []

[project.urls]
Homepage = "https://github.com/terbonium/axiom-nmea"
Repository = "https://github.com/terbonium/axiom-nmea"

[tool.setuptools.packages.find]
include = ["raymarine_nmea*"]
100 axiom-nmea/raymarine_nmea/__init__.py Normal file
@@ -0,0 +1,100 @@
"""
Raymarine NMEA Library

A Python library for decoding Raymarine LightHouse protocol data
and converting it to standard NMEA 0183 sentences.

Example usage:
    from raymarine_nmea import RaymarineDecoder, NMEAGenerator

    # Create decoder and generator
    decoder = RaymarineDecoder()
    generator = NMEAGenerator()

    # Decode a packet (from multicast or PCAP)
    sensor_data = decoder.decode(packet_bytes)

    # Generate NMEA sentences
    sentences = generator.generate_all(sensor_data)
    for sentence in sentences:
        print(sentence)
"""

__version__ = "1.0.0"
__author__ = "Axiom NMEA Project"

# Core protocol components
from .protocol.parser import ProtobufParser
from .protocol.decoder import RaymarineDecoder
from .protocol.constants import (
    WIRE_VARINT,
    WIRE_FIXED64,
    WIRE_LENGTH,
    WIRE_FIXED32,
    RAD_TO_DEG,
    MS_TO_KTS,
    FEET_TO_M,
    KELVIN_OFFSET,
)

# Data models
from .data.store import SensorData

# NMEA generation
from .nmea.generator import NMEAGenerator
from .nmea.sentence import NMEASentence
from .nmea.server import NMEATcpServer

# Listeners
from .listeners.multicast import MulticastListener
from .listeners.pcap import PcapReader

# Sensor configurations
from .sensors import (
    TANK_CONFIG,
    BATTERY_CONFIG,
    MULTICAST_GROUPS,
    MULTICAST_GROUPS_ALL,
)

# Venus OS D-Bus publishing (optional import - only works on Venus OS)
try:
    from . import venus_dbus
    HAS_VENUS_DBUS = True
except ImportError:
    venus_dbus = None
    HAS_VENUS_DBUS = False

__all__ = [
    # Version
    "__version__",
    # Protocol
    "ProtobufParser",
    "RaymarineDecoder",
    # Constants
    "WIRE_VARINT",
    "WIRE_FIXED64",
    "WIRE_LENGTH",
    "WIRE_FIXED32",
    "RAD_TO_DEG",
    "MS_TO_KTS",
    "FEET_TO_M",
    "KELVIN_OFFSET",
    # Data
    "SensorData",
    # NMEA
    "NMEAGenerator",
    "NMEASentence",
    "NMEATcpServer",
    # Listeners
    "MulticastListener",
    "PcapReader",
    # Config
    "TANK_CONFIG",
    "BATTERY_CONFIG",
    "MULTICAST_GROUPS",
    "MULTICAST_GROUPS_ALL",
    # Venus OS
    "venus_dbus",
    "HAS_VENUS_DBUS",
]
7 axiom-nmea/raymarine_nmea/data/__init__.py Normal file
@@ -0,0 +1,7 @@
"""
Data storage module.
"""

from .store import SensorData

__all__ = ["SensorData"]
327 axiom-nmea/raymarine_nmea/data/store.py Normal file
@@ -0,0 +1,327 @@
"""
Thread-safe sensor data storage.

This module provides a thread-safe container for aggregating sensor data
from multiple decoded packets over time.
"""

import threading
import time
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, Any, Optional

from ..sensors import (
    get_tank_name,
    get_tank_capacity,
    get_battery_name,
)
from ..protocol.constants import FEET_TO_M


@dataclass
class SensorData:
    """Thread-safe container for current sensor readings.

    This class aggregates data from multiple decoded packets and provides
    thread-safe access for concurrent updates and reads.

    Example:
        data = SensorData()
        decoder = RaymarineDecoder()

        # Update from decoded packet
        result = decoder.decode(packet)
        data.update(result)

        # Read current values
        print(f"GPS: {data.latitude}, {data.longitude}")
        print(f"Heading: {data.heading_deg}°")
    """

    # Position
    latitude: Optional[float] = None
    longitude: Optional[float] = None

    # Navigation
    heading_deg: Optional[float] = None
    cog_deg: Optional[float] = None
    sog_kts: Optional[float] = None

    # Wind
    twd_deg: Optional[float] = None  # True Wind Direction
    tws_kts: Optional[float] = None  # True Wind Speed
    awa_deg: Optional[float] = None  # Apparent Wind Angle
    aws_kts: Optional[float] = None  # Apparent Wind Speed

    # Depth (stored in meters internally)
    depth_m: Optional[float] = None

    # Temperature
    water_temp_c: Optional[float] = None
    air_temp_c: Optional[float] = None

    # Barometric pressure (stored in mbar internally)
    pressure_mbar: Optional[float] = None

    # Tanks: dict of tank_id -> level percentage
    tanks: Dict[int, float] = field(default_factory=dict)

    # Batteries: dict of battery_id -> voltage
    batteries: Dict[int, float] = field(default_factory=dict)

    # Timestamps for each data type (Unix timestamp)
    gps_time: float = 0
    heading_time: float = 0
    wind_time: float = 0
    depth_time: float = 0
    temp_time: float = 0
    pressure_time: float = 0
    tank_time: float = 0
    battery_time: float = 0

    # Statistics
    packet_count: int = 0
    decode_count: int = 0
    start_time: float = field(default_factory=time.time)

    # Thread safety
    _lock: threading.Lock = field(default_factory=threading.Lock)

    @property
    def depth_ft(self) -> Optional[float]:
        """Get depth in feet."""
        if self.depth_m is None:
            return None
        return self.depth_m / FEET_TO_M

    @property
    def water_temp_f(self) -> Optional[float]:
        """Get water temperature in Fahrenheit."""
        if self.water_temp_c is None:
            return None
        return self.water_temp_c * 9 / 5 + 32

    @property
    def air_temp_f(self) -> Optional[float]:
        """Get air temperature in Fahrenheit."""
        if self.air_temp_c is None:
            return None
        return self.air_temp_c * 9 / 5 + 32

    @property
    def pressure_inhg(self) -> Optional[float]:
        """Get barometric pressure in inches of mercury."""
        if self.pressure_mbar is None:
            return None
        return self.pressure_mbar * 0.02953

    @property
    def uptime(self) -> float:
        """Get uptime in seconds."""
        return time.time() - self.start_time

    def update(self, decoded: 'DecodedData') -> None:
        """Update sensor data from a decoded packet.

        Args:
            decoded: DecodedData object from RaymarineDecoder.decode()
        """
        # Import here to avoid circular import
        from ..protocol.decoder import DecodedData

        now = time.time()

        with self._lock:
            self.packet_count += 1

            if decoded.has_data():
                self.decode_count += 1

            # Update GPS
            if decoded.latitude is not None and decoded.longitude is not None:
                self.latitude = decoded.latitude
                self.longitude = decoded.longitude
                self.gps_time = now

            # Update heading
            if decoded.heading_deg is not None:
                self.heading_deg = decoded.heading_deg
                self.heading_time = now

            # Update COG/SOG
            if decoded.cog_deg is not None:
                self.cog_deg = decoded.cog_deg
            if decoded.sog_kts is not None:
                self.sog_kts = decoded.sog_kts

            # Update wind
            if (decoded.twd_deg is not None or decoded.tws_kts is not None or
                    decoded.aws_kts is not None):
                self.wind_time = now
                if decoded.twd_deg is not None:
                    self.twd_deg = decoded.twd_deg
                if decoded.tws_kts is not None:
                    self.tws_kts = decoded.tws_kts
                if decoded.awa_deg is not None:
                    self.awa_deg = decoded.awa_deg
                if decoded.aws_kts is not None:
                    self.aws_kts = decoded.aws_kts

            # Update depth
            if decoded.depth_m is not None:
                self.depth_m = decoded.depth_m
                self.depth_time = now

            # Update temperature
            if decoded.water_temp_c is not None or decoded.air_temp_c is not None:
                self.temp_time = now
                if decoded.water_temp_c is not None:
                    self.water_temp_c = decoded.water_temp_c
                if decoded.air_temp_c is not None:
                    self.air_temp_c = decoded.air_temp_c

            # Update pressure
            if decoded.pressure_mbar is not None:
                self.pressure_mbar = decoded.pressure_mbar
                self.pressure_time = now

            # Update tanks
            if decoded.tanks:
                self.tanks.update(decoded.tanks)
                self.tank_time = now

            # Update batteries
            if decoded.batteries:
                self.batteries.update(decoded.batteries)
                self.battery_time = now

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for JSON serialization.

        Returns:
            Dictionary with all sensor data
        """
        # Explicit `is not None` checks so legitimate zero readings
        # (e.g. heading 0.0, depth 0.0) are not reported as None
        with self._lock:
            return {
                "timestamp": datetime.now().isoformat(),
                "position": {
                    "latitude": self.latitude,
                    "longitude": self.longitude,
                },
                "navigation": {
                    "heading_deg": round(self.heading_deg, 1) if self.heading_deg is not None else None,
                    "cog_deg": round(self.cog_deg, 1) if self.cog_deg is not None else None,
                    "sog_kts": round(self.sog_kts, 1) if self.sog_kts is not None else None,
                },
                "wind": {
                    "true_direction_deg": round(self.twd_deg, 1) if self.twd_deg is not None else None,
                    "true_speed_kts": round(self.tws_kts, 1) if self.tws_kts is not None else None,
                    "apparent_angle_deg": round(self.awa_deg, 1) if self.awa_deg is not None else None,
                    "apparent_speed_kts": round(self.aws_kts, 1) if self.aws_kts is not None else None,
                },
                "depth": {
                    "meters": round(self.depth_m, 1) if self.depth_m is not None else None,
                    "feet": round(self.depth_ft, 1) if self.depth_ft is not None else None,
                },
                "temperature": {
                    "water_c": round(self.water_temp_c, 1) if self.water_temp_c is not None else None,
                    "water_f": round(self.water_temp_f, 1) if self.water_temp_f is not None else None,
                    "air_c": round(self.air_temp_c, 1) if self.air_temp_c is not None else None,
                    "air_f": round(self.air_temp_f, 1) if self.air_temp_f is not None else None,
                },
                "pressure": {
                    "mbar": round(self.pressure_mbar, 1) if self.pressure_mbar is not None else None,
                    "inhg": round(self.pressure_inhg, 2) if self.pressure_inhg is not None else None,
                },
                "tanks": {
                    str(tank_id): {
                        "name": get_tank_name(tank_id),
                        "level_pct": round(level, 1),
                        "capacity_gal": get_tank_capacity(tank_id),
                    } for tank_id, level in self.tanks.items()
                },
                "batteries": {
                    str(battery_id): {
                        "name": get_battery_name(battery_id),
                        "voltage_v": round(voltage, 2),
                    } for battery_id, voltage in self.batteries.items()
                },
                "stats": {
                    "packets": self.packet_count,
                    "decoded": self.decode_count,
                    "uptime_s": round(self.uptime, 1),
                }
            }

    def get_age(self, data_type: str) -> Optional[float]:
        """Get the age of a data type in seconds.

        Args:
            data_type: One of 'gps', 'heading', 'wind', 'depth', 'temp',
                'pressure', 'tank', 'battery'

        Returns:
            Age in seconds, or None if no data has been received
        """
        with self._lock:
            time_map = {
                'gps': self.gps_time,
                'heading': self.heading_time,
                'wind': self.wind_time,
                'depth': self.depth_time,
                'temp': self.temp_time,
                'pressure': self.pressure_time,
                'tank': self.tank_time,
                'battery': self.battery_time,
            }
            ts = time_map.get(data_type, 0)
        if ts == 0:
            return None
        return time.time() - ts

    def is_stale(self, data_type: str, max_age: float = 10.0) -> bool:
        """Check if a data type is stale.

        Args:
            data_type: One of 'gps', 'heading', 'wind', 'depth', 'temp',
                'pressure', 'tank', 'battery'
            max_age: Maximum age in seconds before data is considered stale

        Returns:
            True if data is stale or missing
        """
        age = self.get_age(data_type)
        if age is None:
            return True
        return age > max_age

    def reset(self) -> None:
        """Reset all data and statistics."""
        with self._lock:
            self.latitude = None
            self.longitude = None
            self.heading_deg = None
            self.cog_deg = None
            self.sog_kts = None
            self.twd_deg = None
            self.tws_kts = None
            self.awa_deg = None
            self.aws_kts = None
            self.depth_m = None
            self.water_temp_c = None
            self.air_temp_c = None
            self.pressure_mbar = None
            self.tanks.clear()
            self.batteries.clear()
            self.gps_time = 0
            self.heading_time = 0
            self.wind_time = 0
            self.depth_time = 0
            self.temp_time = 0
            self.pressure_time = 0
            self.tank_time = 0
            self.battery_time = 0
            self.packet_count = 0
            self.decode_count = 0
            self.start_time = time.time()
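The `depth_ft`, `water_temp_f`, and `pressure_inhg` properties above are plain unit conversions. A standalone sketch of the arithmetic (`FEET_TO_M` is assumed here to be 0.3048, the standard metres-per-foot value; the real constant lives in `protocol.constants`):

```python
# Standalone versions of the conversions behind SensorData's properties,
# including the None-passthrough for missing readings.
FEET_TO_M = 0.3048  # assumed value of the FEET_TO_M constant


def depth_ft(depth_m):
    # metres -> feet: divide by metres-per-foot
    return None if depth_m is None else depth_m / FEET_TO_M


def temp_f(temp_c):
    # Celsius -> Fahrenheit
    return None if temp_c is None else temp_c * 9 / 5 + 32


def pressure_inhg(mbar):
    # millibar -> inches of mercury (same 0.02953 factor as the property)
    return None if mbar is None else mbar * 0.02953


print(round(depth_ft(3.048), 1))         # 10.0
print(temp_f(20.0))                      # 68.0
print(round(pressure_inhg(1013.25), 2))  # 29.92
```

The None-passthrough matters: a sensor that has never reported keeps its field at `None`, and the derived imperial value must also be `None` rather than raising a TypeError.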
11 axiom-nmea/raymarine_nmea/listeners/__init__.py Normal file
@@ -0,0 +1,11 @@
"""
Data source listeners for Raymarine data.

MulticastListener - Listen on UDP multicast groups for live data
PcapReader - Read packets from PCAP files for offline analysis
"""

from .multicast import MulticastListener
from .pcap import PcapReader

__all__ = ["MulticastListener", "PcapReader"]
184 axiom-nmea/raymarine_nmea/listeners/multicast.py Normal file
@@ -0,0 +1,184 @@
"""
Multicast UDP listener for Raymarine data.

Listens on multiple multicast groups simultaneously and decodes
incoming Raymarine packets.
"""

import socket
import struct
import threading
from typing import List, Tuple, Optional, Callable

from ..protocol.decoder import RaymarineDecoder, DecodedData
from ..data.store import SensorData
from ..sensors import MULTICAST_GROUPS


class MulticastListener:
    """Listens on Raymarine multicast groups for sensor data.

    This class creates UDP sockets for each configured multicast group
    and receives packets in separate threads. Decoded data is stored
    in a thread-safe SensorData object.

    Example:
        data = SensorData()
        decoder = RaymarineDecoder()
        listener = MulticastListener(
            decoder=decoder,
            sensor_data=data,
            interface_ip="198.18.5.5"
        )

        listener.start()
        try:
            while True:
                print(f"GPS: {data.latitude}, {data.longitude}")
                time.sleep(1)
        finally:
            listener.stop()
    """

    def __init__(
        self,
        decoder: RaymarineDecoder,
        sensor_data: SensorData,
        interface_ip: str,
        groups: Optional[List[Tuple[str, int]]] = None,
        on_packet: Optional[Callable[[bytes, str, int], None]] = None,
        on_decode: Optional[Callable[[DecodedData], None]] = None,
    ):
        """Initialize the multicast listener.

        Args:
            decoder: RaymarineDecoder instance
            sensor_data: SensorData instance to store decoded values
            interface_ip: IP address of the network interface to use
            groups: List of (multicast_group, port) tuples.
                If None, uses default MULTICAST_GROUPS.
            on_packet: Optional callback for each received packet.
                Called with (packet_bytes, group, port).
            on_decode: Optional callback for each decoded packet.
                Called with DecodedData object.
        """
        self.decoder = decoder
        self.sensor_data = sensor_data
        self.interface_ip = interface_ip
        self.groups = groups or MULTICAST_GROUPS
        self.on_packet = on_packet
        self.on_decode = on_decode

        self.running = False
        self.sockets: List[socket.socket] = []
        self.threads: List[threading.Thread] = []

    def _create_socket(self, group: str, port: int) -> Optional[socket.socket]:
        """Create a UDP socket for a multicast group.

        Args:
            group: Multicast group address
            port: UDP port number

        Returns:
            Configured socket, or None on error
        """
        try:
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)

            # Enable port reuse if available
            if hasattr(socket, 'SO_REUSEPORT'):
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)

            sock.bind(('', port))

            # Join multicast group
            mreq = struct.pack(
                "4s4s",
                socket.inet_aton(group),
                socket.inet_aton(self.interface_ip)
            )
            sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

            # Set timeout for graceful shutdown
            sock.settimeout(1.0)

            return sock

        except Exception as e:
            print(f"Error creating socket for {group}:{port}: {e}")
            return None

    def _listen(self, sock: socket.socket, group: str, port: int) -> None:
        """Listen on a socket and decode packets.

        Args:
            sock: UDP socket
            group: Multicast group (for logging)
            port: Port number (for logging)
        """
        while self.running:
            try:
                data, addr = sock.recvfrom(65535)

                # Optional packet callback
                if self.on_packet:
                    self.on_packet(data, group, port)

                # Decode the packet
                result = self.decoder.decode(data)

                # Update sensor data
                self.sensor_data.update(result)

                # Optional decode callback
                if self.on_decode and result.has_data():
                    self.on_decode(result)

            except socket.timeout:
                continue
            except Exception as e:
                if self.running:
                    print(f"Error on {group}:{port}: {e}")

    def start(self) -> None:
        """Start listening on all multicast groups."""
        self.running = True

        for group, port in self.groups:
            sock = self._create_socket(group, port)
            if sock:
                self.sockets.append(sock)
                thread = threading.Thread(
                    target=self._listen,
                    args=(sock, group, port),
                    daemon=True,
                    name=f"listener-{group}:{port}"
                )
                thread.start()
                self.threads.append(thread)
                print(f"Listening on {group}:{port}")

    def stop(self) -> None:
        """Stop all listeners and close sockets."""
        self.running = False

        # Wait for threads to finish
        for thread in self.threads:
            thread.join(timeout=2.0)

        # Close sockets
        for sock in self.sockets:
            try:
                sock.close()
            except Exception:
                pass

        self.threads.clear()
        self.sockets.clear()

    @property
    def is_running(self) -> bool:
        """Check if the listener is running."""
        return self.running and any(t.is_alive() for t in self.threads)
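The `struct.pack("4s4s", ...)` call in `_create_socket` builds a C `struct ip_mreq` by hand: the 4-byte multicast group address followed by the 4-byte local interface address, which is exactly what `IP_ADD_MEMBERSHIP` expects. A sketch with a hypothetical group address (the real groups come from `MULTICAST_GROUPS`):

```python
import socket
import struct

group = "224.30.30.1"        # hypothetical Raymarine multicast group
interface_ip = "198.18.5.5"  # interface address from the class docstring

# inet_aton turns each dotted-quad into 4 network-order bytes;
# "4s4s" concatenates them into the 8-byte ip_mreq structure.
mreq = struct.pack("4s4s",
                   socket.inet_aton(group),
                   socket.inet_aton(interface_ip))

print(len(mreq))  # 8
```

Passing the interface address (rather than `INADDR_ANY`) pins the membership to the Raymarine network segment, which matters on a Cerbo GX with multiple interfaces.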
210 axiom-nmea/raymarine_nmea/listeners/pcap.py Normal file
@@ -0,0 +1,210 @@
"""
PCAP file reader for offline analysis.

Reads Raymarine packets from PCAP capture files.
"""

import struct
from typing import List, Iterator, Tuple, Optional

from ..protocol.decoder import RaymarineDecoder, DecodedData
from ..data.store import SensorData


class PcapReader:
    """Reads packets from PCAP capture files.

    Supports standard libpcap format (both little and big endian).
    Extracts UDP payloads from Ethernet/IP/UDP frames.

    Example:
        reader = PcapReader("capture.pcap")
        decoder = RaymarineDecoder()
        data = SensorData()

        for packet in reader:
            result = decoder.decode(packet)
            data.update(result)

        print(f"Processed {len(reader)} packets")
        print(f"Final GPS: {data.latitude}, {data.longitude}")
    """

    # Ethernet header size
    ETH_HEADER_SIZE = 14

    # IP header minimum size
    IP_HEADER_MIN_SIZE = 20

    # UDP header size
    UDP_HEADER_SIZE = 8

    def __init__(self, filename: str):
        """Initialize the PCAP reader.

        Args:
            filename: Path to the PCAP file
        """
        self.filename = filename
        self._packets: Optional[List[bytes]] = None

    def _read_pcap(self) -> List[bytes]:
        """Read and parse the PCAP file.

        Returns:
            List of UDP payload bytes
        """
        packets = []

        with open(self.filename, 'rb') as f:
            # Read global header
            header = f.read(24)
            if len(header) < 24:
                return packets

            # Check magic number to determine endianness
            magic = struct.unpack('<I', header[0:4])[0]
            if magic == 0xa1b2c3d4:
                endian = '<'  # Little-endian
            elif magic == 0xd4c3b2a1:
                endian = '>'  # Big-endian (swapped)
            else:
                raise ValueError(f"Invalid PCAP magic: {magic:08x}")

            # Read packets
            while True:
                # Read packet header (16 bytes)
                pkt_header = f.read(16)
                if len(pkt_header) < 16:
                    break

                ts_sec, ts_usec, incl_len, orig_len = struct.unpack(
                    f'{endian}IIII', pkt_header
                )

                # Read packet data
                pkt_data = f.read(incl_len)
                if len(pkt_data) < incl_len:
                    break

                # Extract UDP payload
                payload = self._extract_udp_payload(pkt_data)
                if payload:
                    packets.append(payload)

        return packets

    def _extract_udp_payload(self, frame: bytes) -> Optional[bytes]:
        """Extract UDP payload from an Ethernet frame.

        Args:
            frame: Raw Ethernet frame

        Returns:
            UDP payload bytes, or None if not a UDP packet
        """
        # Need at least Ethernet + IP + UDP headers
        min_size = self.ETH_HEADER_SIZE + self.IP_HEADER_MIN_SIZE + self.UDP_HEADER_SIZE
        if len(frame) < min_size:
            return None

        # Check EtherType (offset 12-13)
        ethertype = struct.unpack('>H', frame[12:14])[0]
        if ethertype != 0x0800:  # IPv4
            return None

        # Parse IP header
        ip_start = self.ETH_HEADER_SIZE
        ip_version_ihl = frame[ip_start]
        ip_version = (ip_version_ihl >> 4) & 0x0F
        ip_header_len = (ip_version_ihl & 0x0F) * 4

        if ip_version != 4:
            return None

        # Check protocol (offset 9 in IP header)
        protocol = frame[ip_start + 9]
        if protocol != 17:  # UDP
            return None

        # UDP payload starts after IP and UDP headers
        udp_start = ip_start + ip_header_len
        payload_start = udp_start + self.UDP_HEADER_SIZE

        if payload_start >= len(frame):
            return None

        return frame[payload_start:]

    @property
    def packets(self) -> List[bytes]:
        """Get all packets from the PCAP file (cached)."""
        if self._packets is None:
            self._packets = self._read_pcap()
        return self._packets

    def __len__(self) -> int:
        """Return the number of packets."""
        return len(self.packets)

    def __iter__(self) -> Iterator[bytes]:
        """Iterate over packets."""
        return iter(self.packets)

    def __getitem__(self, index: int) -> bytes:
        """Get a packet by index."""
        return self.packets[index]

    def decode_all(
        self,
        decoder: Optional[RaymarineDecoder] = None
    ) -> SensorData:
        """Decode all packets and return aggregated sensor data.

        Args:
            decoder: RaymarineDecoder instance, or None to create one

        Returns:
            SensorData with all decoded values
        """
        if decoder is None:
            decoder = RaymarineDecoder()

        data = SensorData()

        for packet in self:
            result = decoder.decode(packet)
            data.update(result)

        return data

    def iter_decoded(
        self,
        decoder: Optional[RaymarineDecoder] = None
    ) -> Iterator[Tuple[bytes, DecodedData]]:
        """Iterate over packets and their decoded data.

        Args:
            decoder: RaymarineDecoder instance, or None to create one

        Yields:
            Tuples of (packet_bytes, DecodedData)
        """
        if decoder is None:
            decoder = RaymarineDecoder()

        for packet in self:
            result = decoder.decode(packet)
            yield packet, result

    @classmethod
    def from_file(cls, filename: str) -> 'PcapReader':
        """Create a PcapReader from a file path.

        Args:
            filename: Path to PCAP file

        Returns:
            PcapReader instance
        """
        return cls(filename)
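The header parsing in `_read_pcap` and `_extract_udp_payload` can be exercised without a capture file by building a one-packet pcap in memory. This sketch mirrors the same offsets (24-byte global header, 16-byte record header, Ethernet/IPv4/UDP framing) rather than calling PcapReader itself, so it is fully self-contained:

```python
import struct

# A hypothetical UDP payload, wrapped the way PcapReader expects to find it.
payload = b"raymarine"

eth = b"\x00" * 12 + struct.pack(">H", 0x0800)                 # EtherType IPv4
ip = bytes([0x45]) + b"\x00" * 8 + bytes([17]) + b"\x00" * 10  # IHL=5, proto=UDP
udp = b"\x00" * 8                                              # 8-byte UDP header
frame = eth + ip + udp + payload

# 24-byte little-endian global header: magic, v2.4, zone, sigfigs, snaplen, linktype
pcap = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)
# 16-byte record header: ts_sec, ts_usec, incl_len, orig_len
pcap += struct.pack("<IIII", 0, 0, len(frame), len(frame)) + frame

# Parse it back with the same logic as _read_pcap / _extract_udp_payload
magic = struct.unpack("<I", pcap[0:4])[0]            # 0xA1B2C3D4 -> little-endian
_, _, incl_len, _ = struct.unpack("<IIII", pcap[24:40])
data = pcap[40:40 + incl_len]

ethertype = struct.unpack(">H", data[12:14])[0]      # must be 0x0800
ihl = (data[14] & 0x0F) * 4                          # IP header length in bytes
proto = data[14 + 9]                                 # must be 17 (UDP)
extracted = data[14 + ihl + 8:]                      # skip Eth + IP + UDP headers

print(extracted)  # b'raymarine'
```

The same walk handles the byte-swapped magic `0xD4C3B2A1` by flipping the struct endianness prefix, which is all PcapReader does differently for big-endian captures.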
86 axiom-nmea/raymarine_nmea/nmea/__init__.py Normal file
@@ -0,0 +1,86 @@
"""
NMEA 0183 sentence generation module.

This module provides classes for generating standard NMEA 0183 sentences
from sensor data. Supported sentence types:

GPS/Position:
- GGA: GPS Fix Data
- GLL: Geographic Position
- RMC: Recommended Minimum

Navigation:
- HDG: Heading (magnetic with deviation/variation)
- HDT: Heading True
- VTG: Track Made Good and Ground Speed
- VHW: Water Speed and Heading

Wind:
- MWV: Wind Speed and Angle
- MWD: Wind Direction and Speed

Depth:
- DPT: Depth
- DBT: Depth Below Transducer

Temperature:
- MTW: Water Temperature
- MTA: Air Temperature (proprietary extension)

Transducer (tanks, batteries):
- XDR: Transducer Measurements
"""

from .sentence import NMEASentence
from .generator import NMEAGenerator
from .server import NMEATcpServer

# Import all sentence types
from .sentences import (
    # GPS
    GGASentence,
    GLLSentence,
    RMCSentence,
    # Navigation
    HDGSentence,
    HDTSentence,
    VTGSentence,
    VHWSentence,
    # Wind
    MWVSentence,
    MWDSentence,
    # Depth
    DPTSentence,
    DBTSentence,
    # Temperature
    MTWSentence,
    MTASentence,
    # Transducer
    XDRSentence,
)

__all__ = [
    "NMEASentence",
    "NMEAGenerator",
    "NMEATcpServer",
    # GPS
    "GGASentence",
    "GLLSentence",
    "RMCSentence",
    # Navigation
    "HDGSentence",
    "HDTSentence",
    "VTGSentence",
    "VHWSentence",
    # Wind
    "MWVSentence",
    "MWDSentence",
    # Depth
    "DPTSentence",
    "DBTSentence",
    # Temperature
    "MTWSentence",
    "MTASentence",
    # Transducer
    "XDRSentence",
]
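All of the sentence classes above share NMEA 0183 framing: `$`, the sentence body, `*`, a two-digit uppercase hex checksum, and CRLF. The checksum is the XOR of every character between `$` and `*`. A standalone sketch (the sentence body is hypothetical; NMEASentence's own API is not assumed here):

```python
def nmea_checksum(body: str) -> str:
    """XOR every character of the body, rendered as two hex digits."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return f"{cs:02X}"


# Hypothetical MTW (water temperature) body: 18.5 degrees Celsius
body = "GPMTW,18.5,C"
sentence = f"${body}*{nmea_checksum(body)}\r\n"
print(sentence.strip())  # $GPMTW,18.5,C*08
```

A receiver recomputes the XOR over the body and rejects the sentence if it does not match the two digits after `*`, which is why every generated sentence ends with this trailer.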
523 axiom-nmea/raymarine_nmea/nmea/generator.py Normal file
@@ -0,0 +1,523 @@
"""
NMEA sentence generator.

Generates complete sets of NMEA 0183 sentences from sensor data.
"""

from datetime import datetime
from typing import List, Optional, Dict, Set

from ..data.store import SensorData
from ..sensors import get_tank_name, get_battery_name

from .sentences import (
    GGASentence,
    GLLSentence,
    RMCSentence,
    HDGSentence,
    HDTSentence,
    VTGSentence,
    VHWSentence,
    MWVSentence,
    MWDSentence,
    DPTSentence,
    DBTSentence,
    MTWSentence,
    MTASentence,
    XDRSentence,
)


class NMEAGenerator:
    """Generates NMEA 0183 sentences from sensor data.

    This class provides methods to generate various NMEA sentences
    from SensorData. It can generate individual sentence types or
    complete sets of all available sentences.

    Example:
        generator = NMEAGenerator()

        # Generate all sentences
        sentences = generator.generate_all(sensor_data)
        for sentence in sentences:
            print(sentence, end='')

        # Generate specific sentences
        gga = generator.generate_gga(sensor_data)
        mwd = generator.generate_mwd(sensor_data)

    Sentence Types Generated:
        GPS: GGA, GLL, RMC
        Navigation: HDG, HDT, VTG, VHW
        Wind: MWV (apparent & true), MWD
        Depth: DPT, DBT
        Temperature: MTW, MTA
        Transducers: XDR (tanks, batteries)
    """

    # Default magnetic variation (can be overridden)
    DEFAULT_MAG_VARIATION = 0.0

    # Sentence types that can be enabled/disabled
    SENTENCE_TYPES = {
        'gga', 'gll', 'rmc',                           # GPS
        'hdg', 'hdt', 'vtg', 'vhw',                    # Navigation
        'mwv_apparent', 'mwv_true', 'mwd',             # Wind
        'dpt', 'dbt',                                  # Depth
        'mtw', 'mta',                                  # Temperature
        'xdr_tanks', 'xdr_batteries', 'xdr_pressure',  # Transducers
    }

    def __init__(
        self,
        mag_variation: Optional[float] = None,
        enabled_sentences: Optional[Set[str]] = None,
        transducer_offset: float = 0.0,
    ):
        """Initialize the NMEA generator.

        Args:
            mag_variation: Magnetic variation in degrees (positive=East)
            enabled_sentences: Set of sentence types to generate.
                If None, all sentences are enabled.
            transducer_offset: Depth transducer offset in meters
                (positive = to waterline, negative = to keel)
        """
        # Explicit None checks so a 0.0 variation or an empty set are honored
        self.mag_variation = (
            mag_variation if mag_variation is not None
            else self.DEFAULT_MAG_VARIATION
        )
        self.enabled = (
            set(enabled_sentences) if enabled_sentences is not None
            else self.SENTENCE_TYPES.copy()
        )
        self.transducer_offset = transducer_offset

    def is_enabled(self, sentence_type: str) -> bool:
        """Check if a sentence type is enabled."""
        return sentence_type in self.enabled

    def enable(self, sentence_type: str) -> None:
        """Enable a sentence type."""
        if sentence_type in self.SENTENCE_TYPES:
            self.enabled.add(sentence_type)

    def disable(self, sentence_type: str) -> None:
        """Disable a sentence type."""
        self.enabled.discard(sentence_type)

    def generate_all(self, data: SensorData) -> List[str]:
        """Generate all enabled NMEA sentences.

        Args:
            data: SensorData object with current sensor values

        Returns:
            List of NMEA sentence strings (with CRLF)
        """
        sentences = []
        now = datetime.utcnow()

        # GPS sentences
        if self.is_enabled('gga'):
            s = self.generate_gga(data, now)
            if s:
                sentences.append(s)

        if self.is_enabled('gll'):
            s = self.generate_gll(data, now)
            if s:
                sentences.append(s)

        if self.is_enabled('rmc'):
            s = self.generate_rmc(data, now)
            if s:
                sentences.append(s)

        # Navigation sentences
        if self.is_enabled('hdg'):
            s = self.generate_hdg(data)
            if s:
                sentences.append(s)

        if self.is_enabled('hdt'):
            s = self.generate_hdt(data)
            if s:
                sentences.append(s)

        if self.is_enabled('vtg'):
            s = self.generate_vtg(data)
            if s:
                sentences.append(s)

        if self.is_enabled('vhw'):
            s = self.generate_vhw(data)
            if s:
                sentences.append(s)

        # Wind sentences
        if self.is_enabled('mwv_apparent'):
            s = self.generate_mwv_apparent(data)
            if s:
                sentences.append(s)

        if self.is_enabled('mwv_true'):
            s = self.generate_mwv_true(data)
            if s:
                sentences.append(s)

        if self.is_enabled('mwd'):
            s = self.generate_mwd(data)
            if s:
                sentences.append(s)

        # Depth sentences
        if self.is_enabled('dpt'):
            s = self.generate_dpt(data)
            if s:
                sentences.append(s)

        if self.is_enabled('dbt'):
            s = self.generate_dbt(data)
            if s:
                sentences.append(s)

        # Temperature sentences
        if self.is_enabled('mtw'):
            s = self.generate_mtw(data)
            if s:
                sentences.append(s)

        if self.is_enabled('mta'):
            s = self.generate_mta(data)
            if s:
                sentences.append(s)

        # Transducer sentences
        if self.is_enabled('xdr_tanks'):
            s = self.generate_xdr_tanks(data)
            if s:
                sentences.append(s)

        if self.is_enabled('xdr_batteries'):
            s = self.generate_xdr_batteries(data)
            if s:
                sentences.append(s)

        if self.is_enabled('xdr_pressure'):
            s = self.generate_xdr_pressure(data)
            if s:
                sentences.append(s)

        return sentences

    # GPS Sentences

    def generate_gga(
        self,
        data: SensorData,
        time: Optional[datetime] = None
    ) -> Optional[str]:
        """Generate GGA (GPS Fix Data) sentence."""
        with data._lock:
            lat = data.latitude
            lon = data.longitude

        if lat is None or lon is None:
            return None

        sentence = GGASentence(
            latitude=lat,
            longitude=lon,
            time=time,
        )
        return sentence.to_nmea()

    def generate_gll(
        self,
        data: SensorData,
        time: Optional[datetime] = None
    ) -> Optional[str]:
        """Generate GLL (Geographic Position) sentence."""
        with data._lock:
            lat = data.latitude
            lon = data.longitude

        if lat is None or lon is None:
            return None

        sentence = GLLSentence(
            latitude=lat,
            longitude=lon,
            time=time,
        )
        return sentence.to_nmea()

    def generate_rmc(
        self,
        data: SensorData,
        time: Optional[datetime] = None
    ) -> Optional[str]:
        """Generate RMC (Recommended Minimum) sentence."""
        with data._lock:
            lat = data.latitude
            lon = data.longitude
            sog = data.sog_kts
            cog = data.cog_deg

        if lat is None or lon is None:
            return None

        sentence = RMCSentence(
            latitude=lat,
            longitude=lon,
            time=time,
            sog=sog,
            cog=cog,
            mag_var=self.mag_variation,
        )
        return sentence.to_nmea()

    # Navigation Sentences

    def generate_hdg(self, data: SensorData) -> Optional[str]:
        """Generate HDG (Heading, Deviation & Variation) sentence."""
        with data._lock:
            heading = data.heading_deg

        if heading is None:
            return None

        # Convert true heading to magnetic
        heading_mag = (heading - self.mag_variation) % 360

        sentence = HDGSentence(
            heading=heading_mag,
            deviation=0.0,
            variation=self.mag_variation,
        )
        return sentence.to_nmea()
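The `(heading - self.mag_variation) % 360` expression used here (and again in the VTG, VHW, and MWD generators) converts a true bearing to a magnetic one under the convention that positive variation means East. A minimal standalone sketch of that convention (the function name is illustrative, not part of the module):

```python
def true_to_magnetic(heading_true: float, variation: float) -> float:
    """Convert a true heading to magnetic; variation positive = East."""
    return (heading_true - variation) % 360

# With 10 degrees East variation, a true heading of 5 wraps to 355 magnetic
print(true_to_magnetic(5.0, 10.0))     # 355.0
# West variation (negative) shifts the other way
print(true_to_magnetic(355.0, -10.0))  # 5.0
```

The modulo keeps the result in 0..360 even when the subtraction goes negative, which is why the wrap-around cases above come out right.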
    def generate_hdt(self, data: SensorData) -> Optional[str]:
        """Generate HDT (Heading True) sentence."""
        with data._lock:
            heading = data.heading_deg

        if heading is None:
            return None

        sentence = HDTSentence(heading=heading)
        return sentence.to_nmea()

    def generate_vtg(self, data: SensorData) -> Optional[str]:
        """Generate VTG (Track Made Good) sentence."""
        with data._lock:
            cog = data.cog_deg
            sog = data.sog_kts

        # Need at least one value
        if cog is None and sog is None:
            return None

        cog_mag = None
        if cog is not None:
            cog_mag = (cog - self.mag_variation) % 360

        sentence = VTGSentence(
            cog_true=cog,
            cog_mag=cog_mag,
            sog_kts=sog,
        )
        return sentence.to_nmea()

    def generate_vhw(self, data: SensorData) -> Optional[str]:
        """Generate VHW (Water Speed and Heading) sentence."""
        with data._lock:
            heading = data.heading_deg
            # Note: We don't have water speed, so just use heading

        if heading is None:
            return None

        heading_mag = (heading - self.mag_variation) % 360

        sentence = VHWSentence(
            heading_true=heading,
            heading_mag=heading_mag,
            speed_kts=None,  # No water speed available
        )
        return sentence.to_nmea()

    # Wind Sentences

    def generate_mwv_apparent(self, data: SensorData) -> Optional[str]:
        """Generate MWV sentence for apparent wind."""
        with data._lock:
            awa = data.awa_deg
            aws = data.aws_kts

        if awa is None and aws is None:
            return None

        sentence = MWVSentence(
            angle=awa,
            reference="R",  # Relative/Apparent
            speed=aws,
            speed_units="N",  # Knots
        )
        return sentence.to_nmea()

    def generate_mwv_true(self, data: SensorData) -> Optional[str]:
        """Generate MWV sentence for true wind."""
        with data._lock:
            twd = data.twd_deg
            tws = data.tws_kts
            heading = data.heading_deg

        if twd is None and tws is None:
            return None

        # Calculate true wind angle relative to bow
        twa = None
        if twd is not None and heading is not None:
            twa = (twd - heading) % 360
            if twa > 180:
                twa = twa - 360  # Normalize to -180 to 180

        sentence = MWVSentence(
            angle=twa if twa is not None else twd,
            reference="T",  # True
            speed=tws,
            speed_units="N",  # Knots
        )
        return sentence.to_nmea()
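The normalization step in `generate_mwv_true` turns a compass true wind direction into a bow-relative angle in the -180..180 range (negative = wind from port). The same arithmetic as a standalone sketch:

```python
def true_wind_angle(twd: float, heading: float) -> float:
    """Bow-relative true wind angle, normalized to -180..180 degrees."""
    twa = (twd - heading) % 360
    if twa > 180:
        twa -= 360  # e.g. 340 becomes -20 (wind off the port bow)
    return twa

# Wind from 350 degrees true while heading 010: 20 degrees off the port bow
print(true_wind_angle(350.0, 10.0))  # -20.0
# Wind from 030 degrees true while heading 010: 20 degrees off the starboard bow
print(true_wind_angle(30.0, 10.0))   # 20.0
```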
    def generate_mwd(self, data: SensorData) -> Optional[str]:
        """Generate MWD (Wind Direction and Speed) sentence."""
        with data._lock:
            twd = data.twd_deg
            tws = data.tws_kts

        if twd is None and tws is None:
            return None

        # Calculate magnetic wind direction
        twd_mag = None
        if twd is not None:
            twd_mag = (twd - self.mag_variation) % 360

        sentence = MWDSentence(
            direction_true=twd,
            direction_mag=twd_mag,
            speed_kts=tws,
        )
        return sentence.to_nmea()

    # Depth Sentences

    def generate_dpt(self, data: SensorData) -> Optional[str]:
        """Generate DPT (Depth) sentence."""
        with data._lock:
            depth = data.depth_m

        if depth is None:
            return None

        sentence = DPTSentence(
            depth_m=depth,
            offset_m=self.transducer_offset,
        )
        return sentence.to_nmea()

    def generate_dbt(self, data: SensorData) -> Optional[str]:
        """Generate DBT (Depth Below Transducer) sentence."""
        with data._lock:
            depth = data.depth_m

        if depth is None:
            return None

        sentence = DBTSentence(depth_m=depth)
        return sentence.to_nmea()

    # Temperature Sentences

    def generate_mtw(self, data: SensorData) -> Optional[str]:
        """Generate MTW (Water Temperature) sentence."""
        with data._lock:
            temp = data.water_temp_c

        if temp is None:
            return None

        sentence = MTWSentence(temp_c=temp)
        return sentence.to_nmea()

    def generate_mta(self, data: SensorData) -> Optional[str]:
        """Generate MTA (Air Temperature) sentence."""
        with data._lock:
            temp = data.air_temp_c

        if temp is None:
            return None

        sentence = MTASentence(temp_c=temp)
        return sentence.to_nmea()

    # Transducer Sentences

    def generate_xdr_tanks(self, data: SensorData) -> Optional[str]:
        """Generate XDR sentence for tank levels."""
        with data._lock:
            tanks = dict(data.tanks)

        if not tanks:
            return None

        # Build name mapping
        names = {tid: get_tank_name(tid) for tid in tanks}

        sentence = XDRSentence.for_tanks(tanks, names)
        return sentence.to_nmea()

    def generate_xdr_batteries(self, data: SensorData) -> Optional[str]:
        """Generate XDR sentence for battery voltages."""
        with data._lock:
            batteries = dict(data.batteries)

        if not batteries:
            return None

        # Build name mapping
        names = {bid: get_battery_name(bid) for bid in batteries}

        sentence = XDRSentence.for_batteries(batteries, names)
        return sentence.to_nmea()

    def generate_xdr_pressure(self, data: SensorData) -> Optional[str]:
        """Generate XDR sentence for barometric pressure."""
        with data._lock:
            pressure = data.pressure_mbar

        if pressure is None:
            return None

        sentence = XDRSentence.for_pressure(pressure)
        return sentence.to_nmea()

    def generate_xdr_all(self, data: SensorData) -> List[str]:
        """Generate all XDR sentences (tanks, batteries, pressure).

        Returns separate sentences for each type to avoid
        exceeding NMEA sentence length limits.
        """
        sentences = []

        tanks_xdr = self.generate_xdr_tanks(data)
        if tanks_xdr:
            sentences.append(tanks_xdr)

        batteries_xdr = self.generate_xdr_batteries(data)
        if batteries_xdr:
            sentences.append(batteries_xdr)

        pressure_xdr = self.generate_xdr_pressure(data)
        if pressure_xdr:
            sentences.append(pressure_xdr)

        return sentences
152
axiom-nmea/raymarine_nmea/nmea/sentence.py
Normal file
@@ -0,0 +1,152 @@
"""
Base class for NMEA 0183 sentences.

NMEA 0183 Sentence Format:
    $XXYYY,field1,field2,...,fieldN*CC<CR><LF>

Where:
    $        = Start delimiter
    XX       = Talker ID (e.g., GP, II, WI)
    YYY      = Sentence type (e.g., GGA, RMC)
    ,        = Field delimiter
    *        = Checksum delimiter
    CC       = Two-digit hex checksum (XOR of all chars between $ and *)
    <CR><LF> = Carriage return and line feed
"""

from abc import ABC, abstractmethod
from datetime import datetime
from typing import Optional


class NMEASentence(ABC):
    """Abstract base class for NMEA 0183 sentences.

    Subclasses must implement:
    - sentence_type: The 3-character sentence type (e.g., "GGA")
    - format_fields(): Returns the comma-separated field data

    Example:
        class GGASentence(NMEASentence):
            sentence_type = "GGA"

            def format_fields(self) -> str:
                return "123456.00,2456.123,N,08037.456,W,1,08,0.9,..."
    """

    # Talker ID for this sentence (default: II for integrated instrumentation)
    talker_id: str = "II"

    # Sentence type (e.g., "GGA", "RMC")
    sentence_type: str = ""

    @abstractmethod
    def format_fields(self) -> Optional[str]:
        """Format the sentence fields.

        Returns:
            Comma-separated field string, or None if sentence cannot be generated
        """
        pass

    @staticmethod
    def calculate_checksum(sentence: str) -> str:
        """Calculate NMEA checksum.

        The checksum is the XOR of all characters between $ and *.

        Args:
            sentence: The sentence content (without $ prefix and * suffix)

        Returns:
            Two-character hex checksum
        """
        checksum = 0
        for char in sentence:
            checksum ^= ord(char)
        return f"{checksum:02X}"
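The checksum and framing described in the module docstring can be exercised end to end with a small standalone sketch: XOR every byte between `$` and `*`, render it as two hex digits, and wrap the payload in the `$...*CC\r\n` frame (the helper names here are illustrative; the MTW payload is a made-up water-temperature reading):

```python
def nmea_checksum(content: str) -> str:
    """XOR of every character between '$' and '*', as two hex digits."""
    checksum = 0
    for byte in content.encode("ascii"):
        checksum ^= byte
    return f"{checksum:02X}"

def assemble(talker: str, stype: str, fields: str) -> str:
    """Frame a payload as a complete NMEA 0183 sentence with CRLF."""
    content = f"{talker}{stype},{fields}"
    return f"${content}*{nmea_checksum(content)}\r\n"

print(assemble("II", "MTW", "19.5,C"))  # $IIMTW,19.5,C*1E
```

Note the checksum covers the talker ID, sentence type, and all fields, but not the `$` or `*` delimiters themselves.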
    def to_nmea(self) -> Optional[str]:
        """Generate the complete NMEA sentence.

        Returns:
            Complete NMEA sentence with $ prefix, checksum, and CRLF,
            or None if sentence cannot be generated
        """
        fields = self.format_fields()
        if fields is None:
            return None

        # Build sentence content (between $ and *)
        content = f"{self.talker_id}{self.sentence_type},{fields}"

        # Calculate checksum
        checksum = self.calculate_checksum(content)

        # Return complete sentence
        return f"${content}*{checksum}\r\n"

    def __str__(self) -> str:
        """Return the NMEA sentence as a string."""
        result = self.to_nmea()
        return result if result else ""

    @staticmethod
    def format_latitude(lat: float) -> str:
        """Format latitude for NMEA (DDMM.MMMMMM,N/S).

        Args:
            lat: Latitude in decimal degrees (-90 to 90)

        Returns:
            Formatted string like "2456.123450,N"
        """
        hemisphere = 'N' if lat >= 0 else 'S'
        lat = abs(lat)
        degrees = int(lat)
        minutes = (lat - degrees) * 60
        return f"{degrees:02d}{minutes:09.6f},{hemisphere}"

    @staticmethod
    def format_longitude(lon: float) -> str:
        """Format longitude for NMEA (DDDMM.MMMMMM,E/W).

        Args:
            lon: Longitude in decimal degrees (-180 to 180)

        Returns:
            Formatted string like "08037.456780,W"
        """
        hemisphere = 'E' if lon >= 0 else 'W'
        lon = abs(lon)
        degrees = int(lon)
        minutes = (lon - degrees) * 60
        return f"{degrees:03d}{minutes:09.6f},{hemisphere}"
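The degrees-plus-decimal-minutes conversion used by the two formatters above is easy to sanity-check with worked values. This restates `format_latitude` as a standalone function so it can run on its own (the sample coordinates are arbitrary):

```python
def format_latitude(lat: float) -> str:
    """Decimal degrees -> NMEA DDMM.MMMMMM with hemisphere suffix."""
    hemisphere = 'N' if lat >= 0 else 'S'
    lat = abs(lat)
    degrees = int(lat)               # whole degrees
    minutes = (lat - degrees) * 60   # fractional degrees -> minutes
    return f"{degrees:02d}{minutes:09.6f},{hemisphere}"

# 48.1173 degrees N = 48 degrees 7.038 minutes
print(format_latitude(48.1173))   # 4807.038000,N
# Southern hemisphere flips the suffix
print(format_latitude(-33.8688))  # 3352.128000,S
```

The `:09.6f` spec zero-pads the minutes to two integer digits (width 9 = 2 digits + point + 6 decimals), so single-digit minutes like 7.038 render as `07.038000`.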
    @staticmethod
    def format_time(dt: Optional[datetime] = None) -> str:
        """Format time for NMEA (HHMMSS.SS).

        Args:
            dt: Datetime object, or None for current time

        Returns:
            Formatted string like "123456.00"
        """
        if dt is None:
            dt = datetime.utcnow()
        return dt.strftime("%H%M%S.00")

    @staticmethod
    def format_date(dt: Optional[datetime] = None) -> str:
        """Format date for NMEA (DDMMYY).

        Args:
            dt: Datetime object, or None for current time

        Returns:
            Formatted string like "231224"
        """
        if dt is None:
            dt = datetime.utcnow()
        return dt.strftime("%d%m%y")
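A fixed timestamp makes the two strftime patterns above concrete (note that the `.00` in `%H%M%S.00` is a literal, so fractional seconds are always rendered as zero):

```python
from datetime import datetime

# Arbitrary fixed UTC timestamp for demonstration
dt = datetime(2024, 12, 23, 12, 34, 56)

print(dt.strftime("%H%M%S.00"))  # 123456.00  (NMEA time field)
print(dt.strftime("%d%m%y"))     # 231224     (NMEA date field, day-first)
```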
33
axiom-nmea/raymarine_nmea/nmea/sentences/__init__.py
Normal file
@@ -0,0 +1,33 @@
"""
NMEA sentence implementations.
"""

from .gps import GGASentence, GLLSentence, RMCSentence
from .navigation import HDGSentence, HDTSentence, VTGSentence, VHWSentence
from .wind import MWVSentence, MWDSentence
from .depth import DPTSentence, DBTSentence
from .temperature import MTWSentence, MTASentence
from .transducer import XDRSentence

__all__ = [
    # GPS
    "GGASentence",
    "GLLSentence",
    "RMCSentence",
    # Navigation
    "HDGSentence",
    "HDTSentence",
    "VTGSentence",
    "VHWSentence",
    # Wind
    "MWVSentence",
    "MWDSentence",
    # Depth
    "DPTSentence",
    "DBTSentence",
    # Temperature
    "MTWSentence",
    "MTASentence",
    # Transducer
    "XDRSentence",
]
113
axiom-nmea/raymarine_nmea/nmea/sentences/depth.py
Normal file
@@ -0,0 +1,113 @@
"""
Depth-related NMEA sentences.

DPT - Depth
DBT - Depth Below Transducer
"""

from typing import Optional

from ..sentence import NMEASentence


class DPTSentence(NMEASentence):
    """DPT - Depth of Water.

    Format:
        $IIDPT,D.D,O.O,R.R*CC

    Fields:
        1. Depth in meters (relative to transducer)
        2. Offset from transducer in meters
           - Positive = distance from transducer to water line
           - Negative = distance from transducer to keel
        3. Maximum range scale in use (optional)

    Example:
        $IIDPT,12.5,0.5,100*6E

    Note:
        To get depth below keel: depth + offset (when offset is negative)
        To get depth below surface: depth + offset (when offset is positive)
    """

    talker_id = "II"
    sentence_type = "DPT"

    def __init__(
        self,
        depth_m: Optional[float] = None,
        offset_m: float = 0.0,
        max_range: Optional[float] = None,
    ):
        """Initialize DPT sentence.

        Args:
            depth_m: Depth in meters (relative to transducer)
            offset_m: Offset from transducer in meters
            max_range: Maximum range scale in meters
        """
        self.depth_m = depth_m
        self.offset_m = offset_m
        self.max_range = max_range

    def format_fields(self) -> Optional[str]:
        """Format DPT fields."""
        if self.depth_m is None:
            return None

        range_str = f"{self.max_range:.0f}" if self.max_range is not None else ""

        return f"{self.depth_m:.1f},{self.offset_m:.1f},{range_str}"


class DBTSentence(NMEASentence):
    """DBT - Depth Below Transducer.

    Format:
        $IIDBT,D.D,f,D.D,M,D.D,F*CC

    Fields:
        1. Depth in feet
        2. f = Feet
        3. Depth in meters
        4. M = Meters
        5. Depth in fathoms
        6. F = Fathoms

    Example:
        $IIDBT,41.0,f,12.5,M,6.8,F*1C

    Note:
        Depth is measured from the transducer to the bottom.
        Does not include offset to keel or waterline.
    """

    talker_id = "II"
    sentence_type = "DBT"

    # Conversion constants
    FEET_PER_METER = 3.28084
    FATHOMS_PER_METER = 0.546807

    def __init__(self, depth_m: Optional[float] = None):
        """Initialize DBT sentence.

        Args:
            depth_m: Depth in meters
        """
        self.depth_m = depth_m

    def format_fields(self) -> Optional[str]:
        """Format DBT fields."""
        if self.depth_m is None:
            return None

        depth_ft = self.depth_m * self.FEET_PER_METER
        depth_fathoms = self.depth_m * self.FATHOMS_PER_METER

        return (
            f"{depth_ft:.1f},f,"
            f"{self.depth_m:.1f},M,"
            f"{depth_fathoms:.1f},F"
        )
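DBT reports the same depth three times in different units, derived from the metric value via the two constants above. A standalone sketch of that field triple, reproducing the 12.5 m example from the DBT docstring (function name is illustrative):

```python
FEET_PER_METER = 3.28084
FATHOMS_PER_METER = 0.546807

def dbt_fields(depth_m: float) -> str:
    """Render the DBT field triple: feet, meters, fathoms."""
    return (
        f"{depth_m * FEET_PER_METER:.1f},f,"
        f"{depth_m:.1f},M,"
        f"{depth_m * FATHOMS_PER_METER:.1f},F"
    )

print(dbt_fields(12.5))  # 41.0,f,12.5,M,6.8,F
```

Rounding happens independently per unit, so the three values are each correct to 0.1 of their own unit rather than exactly interconvertible.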
266
axiom-nmea/raymarine_nmea/nmea/sentences/gps.py
Normal file
@@ -0,0 +1,266 @@
"""
GPS-related NMEA sentences.

GGA - GPS Fix Data
GLL - Geographic Position
RMC - Recommended Minimum Navigation Information
"""

from datetime import datetime
from typing import Optional

from ..sentence import NMEASentence


class GGASentence(NMEASentence):
    """GGA - GPS Fix Data.

    Format:
        $GPGGA,HHMMSS.SS,DDMM.MMMMM,N,DDDMM.MMMMM,W,Q,SS,H.H,A.A,M,G.G,M,A.A,XXXX*CC

    Fields:
        1. Time (UTC) - HHMMSS.SS
        2. Latitude - DDMM.MMMMM
        3. N/S indicator
        4. Longitude - DDDMM.MMMMM
        5. E/W indicator
        6. GPS Quality (0=invalid, 1=GPS, 2=DGPS, 4=RTK fixed, 5=RTK float)
        7. Number of satellites
        8. HDOP (Horizontal Dilution of Precision)
        9. Altitude above mean sea level
        10. Altitude units (M)
        11. Geoidal separation
        12. Geoidal separation units (M)
        13. Age of differential GPS data
        14. Differential reference station ID

    Example:
        $GPGGA,123519.00,4807.038000,N,01131.000000,E,1,08,0.9,545.4,M,47.0,M,,*47
    """

    talker_id = "GP"
    sentence_type = "GGA"

    def __init__(
        self,
        latitude: Optional[float] = None,
        longitude: Optional[float] = None,
        time: Optional[datetime] = None,
        quality: int = 1,
        num_satellites: int = 8,
        hdop: float = 1.0,
        altitude: Optional[float] = None,
        geoid_sep: Optional[float] = None,
    ):
        """Initialize GGA sentence.

        Args:
            latitude: Latitude in decimal degrees
            longitude: Longitude in decimal degrees
            time: UTC time, or None for current time
            quality: GPS quality indicator (1=GPS, 2=DGPS)
            num_satellites: Number of satellites in use
            hdop: Horizontal dilution of precision
            altitude: Altitude above MSL in meters
            geoid_sep: Geoidal separation in meters
        """
        self.latitude = latitude
        self.longitude = longitude
        self.time = time
        self.quality = quality
        self.num_satellites = num_satellites
        self.hdop = hdop
        self.altitude = altitude
        self.geoid_sep = geoid_sep

    def format_fields(self) -> Optional[str]:
        """Format GGA fields."""
        if self.latitude is None or self.longitude is None:
            return None

        time_str = self.format_time(self.time)
        lat_str = self.format_latitude(self.latitude)
        lon_str = self.format_longitude(self.longitude)

        # Format altitude - not yet available from Raymarine
        if self.altitude is not None:
            alt_str = f"{self.altitude:.1f},M"
        else:
            alt_str = "0.0,M"  # Default to 0.0 for parsers that require a value

        # Format geoidal separation - not available from Raymarine
        if self.geoid_sep is not None:
            geoid_str = f"{self.geoid_sep:.1f},M"
        else:
            geoid_str = "-22.0,M"  # Default typical value for compatibility

        return (
            f"{time_str},"
            f"{lat_str},"
            f"{lon_str},"
            f"{self.quality},"
            f"{self.num_satellites:02d},"
            f"{self.hdop:.1f},"
            f"{alt_str},"
            f"{geoid_str},"
            f","  # Age of DGPS (empty)
        )  # Trailing comma leaves the DGPS station ID field empty


class GLLSentence(NMEASentence):
    """GLL - Geographic Position (Latitude/Longitude).

    Format:
        $GPGLL,DDMM.MMMMM,N,DDDMM.MMMMM,W,HHMMSS.SS,A,A*CC

    Fields:
        1. Latitude - DDMM.MMMMM
        2. N/S indicator
        3. Longitude - DDDMM.MMMMM
        4. E/W indicator
        5. Time (UTC) - HHMMSS.SS
        6. Status (A=valid, V=invalid)
        7. Mode indicator (A=autonomous, D=differential, N=invalid)

    Example:
        $GPGLL,4916.45000,N,12311.12000,W,225444.00,A,A*6A
    """

    talker_id = "GP"
    sentence_type = "GLL"

    def __init__(
        self,
        latitude: Optional[float] = None,
        longitude: Optional[float] = None,
        time: Optional[datetime] = None,
        status: str = "A",
        mode: str = "A",
    ):
        """Initialize GLL sentence.

        Args:
            latitude: Latitude in decimal degrees
            longitude: Longitude in decimal degrees
            time: UTC time, or None for current time
            status: Status (A=valid, V=invalid)
            mode: Mode indicator (A=autonomous, D=differential)
        """
        self.latitude = latitude
        self.longitude = longitude
        self.time = time
        self.status = status
        self.mode = mode

    def format_fields(self) -> Optional[str]:
        """Format GLL fields."""
        if self.latitude is None or self.longitude is None:
            return None

        lat_str = self.format_latitude(self.latitude)
        lon_str = self.format_longitude(self.longitude)
        time_str = self.format_time(self.time)

        return f"{lat_str},{lon_str},{time_str},{self.status},{self.mode}"


class RMCSentence(NMEASentence):
    """RMC - Recommended Minimum Navigation Information.

    Format:
        $GPRMC,HHMMSS.SS,A,DDMM.MMMMM,N,DDDMM.MMMMM,W,S.S,C.C,DDMMYY,M.M,E,A*CC

    Fields:
        1. Time (UTC) - HHMMSS.SS
        2. Status (A=valid, V=warning)
        3. Latitude - DDMM.MMMMM
        4. N/S indicator
        5. Longitude - DDDMM.MMMMM
        6. E/W indicator
        7. Speed over ground (knots)
        8. Course over ground (degrees true)
        9. Date - DDMMYY
        10. Magnetic variation (degrees)
        11. Magnetic variation direction (E/W)
        12. Mode indicator (A=autonomous, D=differential)

    Example:
        $GPRMC,225446.00,A,4916.45000,N,12311.12000,W,000.5,054.7,191194,020.3,E,A*68
    """

    talker_id = "GP"
    sentence_type = "RMC"

    def __init__(
        self,
        latitude: Optional[float] = None,
        longitude: Optional[float] = None,
        time: Optional[datetime] = None,
        status: str = "A",
        sog: Optional[float] = None,
        cog: Optional[float] = None,
        mag_var: Optional[float] = None,
        mode: str = "A",
    ):
        """Initialize RMC sentence.

        Args:
            latitude: Latitude in decimal degrees
            longitude: Longitude in decimal degrees
            time: UTC time, or None for current time
            status: Status (A=valid, V=warning)
            sog: Speed over ground in knots
            cog: Course over ground in degrees true
            mag_var: Magnetic variation in degrees (positive=E, negative=W)
            mode: Mode indicator (A=autonomous, D=differential)
        """
        self.latitude = latitude
        self.longitude = longitude
        self.time = time
        self.status = status
        self.sog = sog
        self.cog = cog
        self.mag_var = mag_var
        self.mode = mode

    def format_fields(self) -> Optional[str]:
        """Format RMC fields."""
        if self.latitude is None or self.longitude is None:
            return None

        dt = self.time if self.time else datetime.utcnow()
        time_str = self.format_time(dt)
        date_str = self.format_date(dt)
        lat_str = self.format_latitude(self.latitude)
        lon_str = self.format_longitude(self.longitude)

        # Format SOG - sourced from Field 5.3
        if self.sog is not None:
            sog_str = f"{self.sog:.1f}"
        else:
            sog_str = "0.0"  # Default to 0.0 for parsers that require a value

        # Format COG - sourced from Field 5.1
        if self.cog is not None:
            cog_str = f"{self.cog:.1f}"
        else:
            cog_str = "0.0"  # Default to 0.0 for parsers that require a value

        # Format magnetic variation
        if self.mag_var is not None:
            mag_dir = 'E' if self.mag_var >= 0 else 'W'
            mag_str = f"{abs(self.mag_var):.1f},{mag_dir}"
        else:
            mag_str = ","

        return (
            f"{time_str},"
            f"{self.status},"
            f"{lat_str},"
            f"{lon_str},"
            f"{sog_str},"
            f"{cog_str},"
            f"{date_str},"
            f"{mag_str},"
            f"{self.mode}"
        )
252
axiom-nmea/raymarine_nmea/nmea/sentences/navigation.py
Normal file
@@ -0,0 +1,252 @@
|
||||
"""
|
||||
Navigation-related NMEA sentences.
|
||||
|
||||
HDG - Heading (Deviation & Variation)
|
||||
HDT - Heading True
|
||||
VTG - Track Made Good and Ground Speed
|
||||
VHW - Water Speed and Heading
|
||||
"""
|
||||
|
||||
from typing import Optional
|
||||
|
||||
from ..sentence import NMEASentence
|
||||
|
||||
|
||||
class HDGSentence(NMEASentence):
|
||||
"""HDG - Heading, Deviation & Variation.
|
||||
|
||||
Format:
|
||||
$IIHDG,H.H,D.D,E,V.V,E*CC
|
||||
|
||||
Fields:
|
||||
1. Heading (magnetic) in degrees
|
||||
2. Magnetic deviation in degrees
|
||||
3. Deviation direction (E/W)
|
||||
4. Magnetic variation in degrees
|
||||
5. Variation direction (E/W)
|
||||
|
||||
Example:
|
||||
$IIHDG,238.5,0.0,E,12.6,W*5F
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "HDG"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
heading: Optional[float] = None,
|
||||
deviation: Optional[float] = None,
|
||||
variation: Optional[float] = None,
|
||||
):
|
||||
"""Initialize HDG sentence.
|
||||
|
||||
Args:
|
||||
heading: Magnetic heading in degrees (0-360)
|
||||
deviation: Magnetic deviation in degrees (positive=E, negative=W)
|
||||
variation: Magnetic variation in degrees (positive=E, negative=W)
|
||||
"""
|
||||
self.heading = heading
|
||||
self.deviation = deviation
|
||||
self.variation = variation
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format HDG fields."""
|
||||
if self.heading is None:
|
||||
return None
|
||||
|
||||
heading_str = f"{self.heading:.1f}"
|
||||
|
||||
# Format deviation - not yet available from Raymarine
|
||||
if self.deviation is not None:
|
||||
dev_dir = 'E' if self.deviation >= 0 else 'W'
|
||||
dev_str = f"{abs(self.deviation):.1f},{dev_dir}"
|
||||
else:
|
||||
dev_str = "0.0,E" # Default to 0.0 for parsers that require a value
|
||||
|
||||
# Format variation - configured value, not from Raymarine
|
||||
if self.variation is not None:
|
||||
var_dir = 'E' if self.variation >= 0 else 'W'
|
||||
var_str = f"{abs(self.variation):.1f},{var_dir}"
|
||||
else:
|
||||
var_str = "0.0,E" # Default to 0.0 for parsers that require a value
|
||||
|
||||
return f"{heading_str},{dev_str},{var_str}"
|
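For context on the `*CC` suffix shown in the format strings: the NMEA 0183 checksum is the XOR of every character between `$` and `*`, rendered as two uppercase hex digits. A minimal sketch (the framing presumably lives in the `NMEASentence` base class; the trailing checksums in the docstring examples are illustrative):

```python
from functools import reduce

def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two uppercase hex digits."""
    return f"{reduce(lambda acc, ch: acc ^ ord(ch), body, 0):02X}"

def frame(body: str) -> str:
    """Wrap a sentence body in the '$...*CC' NMEA framing."""
    return f"${body}*{nmea_checksum(body)}"

frame("IIHDG,238.5,0.0,E,12.6,W")
```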
||||
|
||||
|
||||
class HDTSentence(NMEASentence):
|
||||
"""HDT - Heading True.
|
||||
|
||||
Format:
|
||||
$IIHDT,H.H,T*CC
|
||||
|
||||
Fields:
|
||||
1. Heading (true) in degrees
|
||||
2. T = True
|
||||
|
||||
Example:
|
||||
$IIHDT,238.5,T*1C
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "HDT"
|
||||
|
||||
def __init__(self, heading: Optional[float] = None):
|
||||
"""Initialize HDT sentence.
|
||||
|
||||
Args:
|
||||
heading: True heading in degrees (0-360)
|
||||
"""
|
||||
self.heading = heading
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format HDT fields."""
|
||||
if self.heading is None:
|
||||
return None
|
||||
return f"{self.heading:.1f},T"
|
||||
|
||||
|
||||
class VTGSentence(NMEASentence):
|
||||
"""VTG - Track Made Good and Ground Speed.
|
||||
|
||||
Format:
|
||||
$IIVTG,C.C,T,C.C,M,S.S,N,S.S,K,M*CC
|
||||
|
||||
Fields:
|
||||
1. Course over ground (true) in degrees
|
||||
2. T = True
|
||||
3. Course over ground (magnetic) in degrees
|
||||
4. M = Magnetic
|
||||
5. Speed over ground in knots
|
||||
6. N = Knots
|
||||
7. Speed over ground in km/h
|
||||
8. K = Km/h
|
||||
9. Mode indicator (A=autonomous, D=differential)
|
||||
|
||||
Example:
|
||||
$IIVTG,054.7,T,034.4,M,005.5,N,010.2,K,A*28
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "VTG"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
cog_true: Optional[float] = None,
|
||||
cog_mag: Optional[float] = None,
|
||||
sog_kts: Optional[float] = None,
|
||||
mode: str = "A",
|
||||
):
|
||||
"""Initialize VTG sentence.
|
||||
|
||||
Args:
|
||||
cog_true: Course over ground (true) in degrees
|
||||
cog_mag: Course over ground (magnetic) in degrees
|
||||
sog_kts: Speed over ground in knots
|
||||
mode: Mode indicator (A=autonomous, D=differential)
|
||||
"""
|
||||
self.cog_true = cog_true
|
||||
self.cog_mag = cog_mag
|
||||
self.sog_kts = sog_kts
|
||||
self.mode = mode
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format VTG fields."""
|
||||
# Format COG true - sourced from Field 5.1
|
||||
if self.cog_true is not None:
|
||||
cog_t_str = f"{self.cog_true:.1f}"
|
||||
else:
|
||||
cog_t_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
|
||||
# Format COG magnetic - derived from true COG
|
||||
if self.cog_mag is not None:
|
||||
cog_m_str = f"{self.cog_mag:.1f}"
|
||||
else:
|
||||
cog_m_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
|
||||
# Format SOG - sourced from Field 5.3
|
||||
if self.sog_kts is not None:
|
||||
sog_kts_str = f"{self.sog_kts:.1f}"
|
||||
sog_kmh = self.sog_kts * 1.852
|
||||
sog_kmh_str = f"{sog_kmh:.1f}"
|
||||
else:
|
||||
sog_kts_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
sog_kmh_str = "0.0"
|
||||
|
||||
return (
|
||||
f"{cog_t_str},T,"
|
||||
f"{cog_m_str},M,"
|
||||
f"{sog_kts_str},N,"
|
||||
f"{sog_kmh_str},K,"
|
||||
f"{self.mode}"
|
||||
)
|
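VTG reports the same ground speed twice, in knots and km/h; the factor 1.852 is exact, since one nautical mile is defined as 1852 m. A standalone sketch of the speed-pair formatting used above:

```python
def vtg_speed_fields(sog_kts):
    """Format the VTG knots/km-h speed pair the way format_fields() does."""
    if sog_kts is None:
        return "0.0,N,0.0,K"  # default for parsers that require a value
    return f"{sog_kts:.1f},N,{sog_kts * 1.852:.1f},K"

vtg_speed_fields(5.5)  # 5.5 kn * 1.852 = 10.186 km/h -> "5.5,N,10.2,K"
```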
||||
|
||||
|
||||
class VHWSentence(NMEASentence):
|
||||
"""VHW - Water Speed and Heading.
|
||||
|
||||
Format:
|
||||
$IIVHW,H.H,T,H.H,M,S.S,N,S.S,K*CC
|
||||
|
||||
Fields:
|
||||
1. Heading (true) in degrees
|
||||
2. T = True
|
||||
3. Heading (magnetic) in degrees
|
||||
4. M = Magnetic
|
||||
5. Speed (water) in knots
|
||||
6. N = Knots
|
||||
7. Speed (water) in km/h
|
||||
8. K = Km/h
|
||||
|
||||
Example:
|
||||
$IIVHW,238.5,T,225.9,M,4.5,N,8.3,K*5D
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "VHW"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
heading_true: Optional[float] = None,
|
||||
heading_mag: Optional[float] = None,
|
||||
speed_kts: Optional[float] = None,
|
||||
):
|
||||
"""Initialize VHW sentence.
|
||||
|
||||
Args:
|
||||
heading_true: Heading (true) in degrees
|
||||
heading_mag: Heading (magnetic) in degrees
|
||||
speed_kts: Speed through water in knots
|
||||
"""
|
||||
self.heading_true = heading_true
|
||||
self.heading_mag = heading_mag
|
||||
self.speed_kts = speed_kts
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format VHW fields."""
|
||||
# Format headings - true heading sourced from Field 3.2
|
||||
if self.heading_true is not None:
|
||||
hdg_t_str = f"{self.heading_true:.1f}"
|
||||
else:
|
||||
hdg_t_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
|
||||
# Magnetic heading derived from true heading
|
||||
if self.heading_mag is not None:
|
||||
hdg_m_str = f"{self.heading_mag:.1f}"
|
||||
else:
|
||||
hdg_m_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
|
||||
# Format speed - water speed not yet available from Raymarine
|
||||
if self.speed_kts is not None:
|
||||
spd_kts_str = f"{self.speed_kts:.1f}"
|
||||
spd_kmh = self.speed_kts * 1.852
|
||||
spd_kmh_str = f"{spd_kmh:.1f}"
|
||||
else:
|
||||
spd_kts_str = "0.0" # Default to 0.0 for parsers that require a value
|
||||
spd_kmh_str = "0.0"
|
||||
|
||||
return (
|
||||
f"{hdg_t_str},T,"
|
||||
f"{hdg_m_str},M,"
|
||||
f"{spd_kts_str},N,"
|
||||
f"{spd_kmh_str},K"
|
||||
)
|
||||
82
axiom-nmea/raymarine_nmea/nmea/sentences/temperature.py
Normal file
@@ -0,0 +1,82 @@
|
||||
"""
|
||||
Temperature-related NMEA sentences.
|
||||
|
||||
MTW - Water Temperature
|
||||
MTA - Air Temperature (proprietary extension)
|
||||
"""
|
||||
|
||||
from typing import Optional
|
||||
|
||||
from ..sentence import NMEASentence
|
||||
|
||||
|
||||
class MTWSentence(NMEASentence):
|
||||
"""MTW - Water Temperature.
|
||||
|
||||
Format:
|
||||
$IIMTW,T.T,C*CC
|
||||
|
||||
Fields:
|
||||
1. Water temperature
|
||||
2. Unit (C = Celsius)
|
||||
|
||||
Example:
|
||||
$IIMTW,26.5,C*1D
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "MTW"
|
||||
|
||||
def __init__(self, temp_c: Optional[float] = None):
|
||||
"""Initialize MTW sentence.
|
||||
|
||||
Args:
|
||||
temp_c: Water temperature in Celsius
|
||||
"""
|
||||
self.temp_c = temp_c
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format MTW fields."""
|
||||
if self.temp_c is None:
|
||||
return None
|
||||
return f"{self.temp_c:.1f},C"
|
||||
|
||||
|
||||
class MTASentence(NMEASentence):
|
||||
"""MTA - Air Temperature.
|
||||
|
||||
Format:
|
||||
$IIMTA,T.T,C*CC
|
||||
|
||||
Fields:
|
||||
1. Air temperature
|
||||
2. Unit (C = Celsius)
|
||||
|
||||
Example:
|
||||
$IIMTA,24.8,C*0E
|
||||
|
||||
Note:
|
||||
MTA is not a standard NMEA sentence but is commonly used
|
||||
as a proprietary extension for air temperature, following
|
||||
the same format as MTW (water temperature).
|
||||
|
||||
Some devices may use XDR (transducer measurement) for air temp:
|
||||
$IIXDR,C,24.8,C,AirTemp*XX
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "MTA"
|
||||
|
||||
def __init__(self, temp_c: Optional[float] = None):
|
||||
"""Initialize MTA sentence.
|
||||
|
||||
Args:
|
||||
temp_c: Air temperature in Celsius
|
||||
"""
|
||||
self.temp_c = temp_c
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format MTA fields."""
|
||||
if self.temp_c is None:
|
||||
return None
|
||||
return f"{self.temp_c:.1f},C"
|
||||
215
axiom-nmea/raymarine_nmea/nmea/sentences/transducer.py
Normal file
@@ -0,0 +1,215 @@
|
||||
"""
|
||||
Transducer measurement NMEA sentences.
|
||||
|
||||
XDR - Transducer Measurements (generic)
|
||||
|
||||
Used for tanks, batteries, and other sensor data that doesn't have
|
||||
a dedicated NMEA sentence type.
|
||||
"""
|
||||
|
||||
from typing import Optional, List, Tuple
|
||||
|
||||
from ..sentence import NMEASentence
|
||||
|
||||
|
||||
class XDRSentence(NMEASentence):
|
||||
"""XDR - Transducer Measurements.
|
||||
|
||||
Format:
|
||||
$IIXDR,T,D.D,U,N,T,D.D,U,N,...*CC
|
||||
|
||||
Each transducer reading has 4 fields:
|
||||
1. Transducer type
|
||||
2. Data value
|
||||
3. Units
|
||||
4. Transducer name/ID
|
||||
|
||||
Transducer Types:
|
||||
A = Angular displacement (degrees)
|
||||
C = Temperature (Celsius)
|
||||
D = Depth (meters)
|
||||
F = Frequency (Hz)
|
||||
H = Humidity (percent)
|
||||
I = Current (amps)
|
||||
N = Force (Newtons)
|
||||
P = Pressure (Pascals)
|
||||
R = Flow Rate (liters/second)
|
||||
S = Salinity (ppt)
|
||||
T = Tachometer (RPM)
|
||||
U = Voltage (volts)
|
||||
V = Volume (liters)
|
||||
G = Generic (no units)
|
||||
|
||||
Tank Level Examples:
|
||||
$IIXDR,V,75.2,P,FUEL1,V,68.1,P,FUEL2*XX
|
||||
(P = percent for tank levels)
|
||||
|
||||
Battery Examples:
|
||||
$IIXDR,U,26.3,V,HOUSE1,U,27.2,V,HOUSE2*XX
|
||||
(U = voltage transducer type; the unit code is V for volts)
|
||||
|
||||
Note:
|
||||
XDR can contain multiple transducer readings in one sentence.
|
||||
Maximum sentence length is 82 characters, so multiple XDR
|
||||
sentences may be needed for many sensors.
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "XDR"
|
||||
|
||||
def __init__(self, readings: Optional[List[Tuple[str, float, str, str]]] = None):
|
||||
"""Initialize XDR sentence.
|
||||
|
||||
Args:
|
||||
readings: List of (type, value, unit, name) tuples
|
||||
Example: [("V", 75.2, "P", "FUEL1"), ("U", 26.3, "V", "HOUSE1")]
|
||||
"""
|
||||
self.readings = readings or []
|
||||
|
||||
def add_reading(
|
||||
self,
|
||||
transducer_type: str,
|
||||
value: float,
|
||||
unit: str,
|
||||
name: str
|
||||
) -> None:
|
||||
"""Add a transducer reading.
|
||||
|
||||
Args:
|
||||
transducer_type: Type code (e.g., "V" for volume, "U" for voltage)
|
||||
value: Numeric value
|
||||
unit: Unit code (e.g., "P" for percent, "V" for volts)
|
||||
name: Transducer name/ID
|
||||
"""
|
||||
self.readings.append((transducer_type, value, unit, name))
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format XDR fields."""
|
||||
if not self.readings:
|
||||
return None
|
||||
|
||||
parts = []
|
||||
for trans_type, value, unit, name in self.readings:
|
||||
# Clean the name to be NMEA-safe (no commas, asterisks)
|
||||
safe_name = name.replace(",", "_").replace("*", "_")
|
||||
# Use 4 decimal places for pressure (bar ~1.0209), 1 for others
|
||||
if trans_type == "P" and unit == "B":
|
||||
parts.append(f"{trans_type},{value:.4f},{unit},{safe_name}")
|
||||
else:
|
||||
parts.append(f"{trans_type},{value:.1f},{unit},{safe_name}")
|
||||
|
||||
return ",".join(parts)
|
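Because a framed sentence must stay within the 82-character limit noted in the docstring, a sender has to split long reading lists across multiple XDR sentences. A hypothetical greedy chunker (not part of this module) illustrating the idea; the budget leaves headroom for the `$IIXDR,` prefix and `*CC` checksum:

```python
def chunk_xdr_payloads(readings, budget=70):
    """Greedily pack (type, value, unit, name) readings into payloads of
    at most `budget` characters; each chunk becomes one XDR sentence."""
    chunks, current = [], ""
    for t, value, unit, name in readings:
        part = f"{t},{value:.1f},{unit},{name}"
        if current and len(current) + 1 + len(part) > budget:
            chunks.append(current)   # current payload is full; start a new one
            current = part
        else:
            current = part if not current else f"{current},{part}"
    if current:
        chunks.append(current)
    return chunks

tanks = [("V", 75.2, "P", "FUEL1"), ("V", 68.1, "P", "FUEL2"),
         ("U", 26.3, "V", "HOUSE1"), ("U", 27.2, "V", "HOUSE2")]
chunk_xdr_payloads(tanks, budget=30)
```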
||||
|
||||
@classmethod
|
||||
def for_tank(
|
||||
cls,
|
||||
tank_id: int,
|
||||
level_pct: float,
|
||||
name: Optional[str] = None
|
||||
) -> 'XDRSentence':
|
||||
"""Create XDR sentence for a tank level.
|
||||
|
||||
Args:
|
||||
tank_id: Tank ID number
|
||||
level_pct: Tank level in percent (0-100)
|
||||
name: Tank name (uses ID if not provided)
|
||||
|
||||
Returns:
|
||||
XDRSentence instance
|
||||
"""
|
||||
tank_name = name or f"TANK{tank_id}"
|
||||
xdr = cls()
|
||||
xdr.add_reading("V", level_pct, "P", tank_name)
|
||||
return xdr
|
||||
|
||||
@classmethod
|
||||
def for_battery(
|
||||
cls,
|
||||
battery_id: int,
|
||||
voltage: float,
|
||||
name: Optional[str] = None
|
||||
) -> 'XDRSentence':
|
||||
"""Create XDR sentence for a battery voltage.
|
||||
|
||||
Args:
|
||||
battery_id: Battery ID number
|
||||
voltage: Battery voltage in volts
|
||||
name: Battery name (uses ID if not provided)
|
||||
|
||||
Returns:
|
||||
XDRSentence instance
|
||||
"""
|
||||
battery_name = name or f"BATT{battery_id}"
|
||||
xdr = cls()
|
||||
xdr.add_reading("U", voltage, "V", battery_name)
|
||||
return xdr
|
||||
|
||||
@classmethod
|
||||
def for_tanks(
|
||||
cls,
|
||||
tanks: dict,
|
||||
names: Optional[dict] = None
|
||||
) -> 'XDRSentence':
|
||||
"""Create XDR sentence for multiple tanks.
|
||||
|
||||
Args:
|
||||
tanks: Dict of tank_id -> level_pct
|
||||
names: Optional dict of tank_id -> name
|
||||
|
||||
Returns:
|
||||
XDRSentence instance
|
||||
"""
|
||||
names = names or {}
|
||||
xdr = cls()
|
||||
for tank_id, level in sorted(tanks.items()):
|
||||
tank_name = names.get(tank_id, f"TANK{tank_id}")
|
||||
xdr.add_reading("V", level, "P", tank_name)
|
||||
return xdr
|
||||
|
||||
@classmethod
|
||||
def for_batteries(
|
||||
cls,
|
||||
batteries: dict,
|
||||
names: Optional[dict] = None
|
||||
) -> 'XDRSentence':
|
||||
"""Create XDR sentence for multiple batteries.
|
||||
|
||||
Args:
|
||||
batteries: Dict of battery_id -> voltage
|
||||
names: Optional dict of battery_id -> name
|
||||
|
||||
Returns:
|
||||
XDRSentence instance
|
||||
"""
|
||||
names = names or {}
|
||||
xdr = cls()
|
||||
for battery_id, voltage in sorted(batteries.items()):
|
||||
battery_name = names.get(battery_id, f"BATT{battery_id}")
|
||||
xdr.add_reading("U", voltage, "V", battery_name)
|
||||
return xdr
|
||||
|
||||
@classmethod
|
||||
def for_pressure(
|
||||
cls,
|
||||
pressure_mbar: float,
|
||||
name: str = "Barometer"
|
||||
) -> 'XDRSentence':
|
||||
"""Create XDR sentence for barometric pressure.
|
||||
|
||||
Args:
|
||||
pressure_mbar: Pressure in millibars (hPa)
|
||||
name: Sensor name (default: "Barometer")
|
||||
|
||||
Returns:
|
||||
XDRSentence instance
|
||||
|
||||
Note:
|
||||
Pressure is output in bar (1 bar = 1000 mbar) as per NMEA convention.
|
||||
Example: 1020.9 mbar = 1.0209 bar
|
||||
Output: $IIXDR,P,1.0209,B,Barometer*XX
|
||||
"""
|
||||
xdr = cls()
|
||||
# Convert mbar to bar (NMEA standard unit)
|
||||
pressure_bar = pressure_mbar / 1000.0
|
||||
xdr.add_reading("P", pressure_bar, "B", name)
|
||||
return xdr
|
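The mbar-to-bar conversion and the 4-decimal pressure rule combine as follows; a one-line sketch mirroring what `for_pressure()` plus `format_fields()` produce for the reading:

```python
def pressure_reading(pressure_mbar, name="Barometer"):
    """Render a barometric XDR reading: mbar -> bar, 4 decimal places."""
    return f"P,{pressure_mbar / 1000.0:.4f},B,{name}"

pressure_reading(1020.9)  # 1020.9 mbar -> "P,1.0209,B,Barometer"
```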
||||
147
axiom-nmea/raymarine_nmea/nmea/sentences/wind.py
Normal file
@@ -0,0 +1,147 @@
|
||||
"""
|
||||
Wind-related NMEA sentences.
|
||||
|
||||
MWV - Wind Speed and Angle
|
||||
MWD - Wind Direction and Speed
|
||||
"""
|
||||
|
||||
from typing import Optional
|
||||
|
||||
from ..sentence import NMEASentence
|
||||
|
||||
|
||||
class MWVSentence(NMEASentence):
|
||||
"""MWV - Wind Speed and Angle.
|
||||
|
||||
Format:
|
||||
$IIMWV,A.A,R,S.S,U,A*CC
|
||||
|
||||
Fields:
|
||||
1. Wind angle in degrees (0-360)
|
||||
2. Reference (R=Relative/Apparent, T=True)
|
||||
3. Wind speed
|
||||
4. Wind speed units (K=km/h, M=m/s, N=knots, S=statute mph)
|
||||
5. Status (A=valid, V=invalid)
|
||||
|
||||
Example:
|
||||
$IIMWV,045.0,R,12.5,N,A*28
|
||||
|
||||
Note:
|
||||
MWV can represent either apparent or true wind:
|
||||
- Relative (R): Wind angle relative to bow, clockwise
|
||||
- True (T): True wind angle relative to bow
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "MWV"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
angle: Optional[float] = None,
|
||||
reference: str = "R",
|
||||
speed: Optional[float] = None,
|
||||
speed_units: str = "N",
|
||||
status: str = "A",
|
||||
):
|
||||
"""Initialize MWV sentence.
|
||||
|
||||
Args:
|
||||
angle: Wind angle in degrees (0-360)
|
||||
reference: R=Relative/Apparent, T=True
|
||||
speed: Wind speed
|
||||
speed_units: K=km/h, M=m/s, N=knots, S=mph
|
||||
status: A=valid, V=invalid
|
||||
"""
|
||||
self.angle = angle
|
||||
self.reference = reference
|
||||
self.speed = speed
|
||||
self.speed_units = speed_units
|
||||
self.status = status
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format MWV fields."""
|
||||
# Need at least angle or speed
|
||||
if self.angle is None and self.speed is None:
|
||||
return None
|
||||
|
||||
angle_str = f"{self.angle:.1f}" if self.angle is not None else ""
|
||||
speed_str = f"{self.speed:.1f}" if self.speed is not None else ""
|
||||
|
||||
return (
|
||||
f"{angle_str},"
|
||||
f"{self.reference},"
|
||||
f"{speed_str},"
|
||||
f"{self.speed_units},"
|
||||
f"{self.status}"
|
||||
)
|
||||
|
||||
|
||||
class MWDSentence(NMEASentence):
|
||||
"""MWD - Wind Direction and Speed.
|
||||
|
||||
Format:
|
||||
$IIMWD,D.D,T,D.D,M,S.S,N,S.S,M*CC
|
||||
|
||||
Fields:
|
||||
1. Wind direction (true) in degrees
|
||||
2. T = True
|
||||
3. Wind direction (magnetic) in degrees
|
||||
4. M = Magnetic
|
||||
5. Wind speed in knots
|
||||
6. N = Knots
|
||||
7. Wind speed in m/s
|
||||
8. M = Meters/second
|
||||
|
||||
Example:
|
||||
$IIMWD,270.0,T,258.0,M,12.5,N,6.4,M*5A
|
||||
|
||||
Note:
|
||||
MWD provides true wind direction (the direction FROM which wind blows),
|
||||
expressed as a compass bearing, not relative to the vessel.
|
||||
"""
|
||||
|
||||
talker_id = "II"
|
||||
sentence_type = "MWD"
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
direction_true: Optional[float] = None,
|
||||
direction_mag: Optional[float] = None,
|
||||
speed_kts: Optional[float] = None,
|
||||
):
|
||||
"""Initialize MWD sentence.
|
||||
|
||||
Args:
|
||||
direction_true: True wind direction in degrees (from which wind blows)
|
||||
direction_mag: Magnetic wind direction in degrees
|
||||
speed_kts: Wind speed in knots
|
||||
"""
|
||||
self.direction_true = direction_true
|
||||
self.direction_mag = direction_mag
|
||||
self.speed_kts = speed_kts
|
||||
|
||||
def format_fields(self) -> Optional[str]:
|
||||
"""Format MWD fields."""
|
||||
# Need at least direction or speed
|
||||
if self.direction_true is None and self.speed_kts is None:
|
||||
return None
|
||||
|
||||
# Format directions
|
||||
dir_t_str = f"{self.direction_true:.1f}" if self.direction_true is not None else ""
|
||||
dir_m_str = f"{self.direction_mag:.1f}" if self.direction_mag is not None else ""
|
||||
|
||||
# Format speeds
|
||||
if self.speed_kts is not None:
|
||||
spd_kts_str = f"{self.speed_kts:.1f}"
|
||||
spd_ms = self.speed_kts / 1.94384449
|
||||
spd_ms_str = f"{spd_ms:.1f}"
|
||||
else:
|
||||
spd_kts_str = ""
|
||||
spd_ms_str = ""
|
||||
|
||||
return (
|
||||
f"{dir_t_str},T,"
|
||||
f"{dir_m_str},M,"
|
||||
f"{spd_kts_str},N,"
|
||||
f"{spd_ms_str},M"
|
||||
)
|
||||
381
axiom-nmea/raymarine_nmea/nmea/server.py
Normal file
@@ -0,0 +1,381 @@
|
||||
"""
|
||||
NMEA TCP Server.
|
||||
|
||||
Provides a TCP server that broadcasts NMEA 0183 sentences to connected clients.
|
||||
This is useful for feeding navigation apps, charting software, and SignalK.
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import logging
|
||||
import socket
|
||||
import threading
|
||||
from typing import List, Optional, Callable, Set
|
||||
|
||||
from ..data.store import SensorData
|
||||
from .generator import NMEAGenerator
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class NMEATcpServer:
|
||||
"""TCP server that broadcasts NMEA sentences to connected clients.
|
||||
|
||||
This server accepts TCP connections and broadcasts NMEA 0183 sentences
|
||||
to all connected clients. It uses asyncio internally for robust handling
|
||||
of multiple concurrent clients - a slow client won't block others.
|
||||
|
||||
The server publishes ALL available NMEA sentences, which is more data
|
||||
than what Venus OS can display via D-Bus. This includes:
|
||||
- GPS: GGA, GLL, RMC
|
||||
- Navigation: HDG, HDT, VTG, VHW
|
||||
- Wind: MWV (apparent & true), MWD
|
||||
- Depth: DPT, DBT
|
||||
- Temperature: MTW, MTA
|
||||
- Transducers: XDR (tanks, batteries, pressure)
|
||||
|
||||
Example:
|
||||
from raymarine_nmea import SensorData, NMEAGenerator
|
||||
from raymarine_nmea.nmea import NMEATcpServer
|
||||
|
||||
sensor_data = SensorData()
|
||||
server = NMEATcpServer(sensor_data, port=10110)
|
||||
server.start()
|
||||
|
||||
# Later, in your update loop:
|
||||
server.broadcast()
|
||||
|
||||
# When done:
|
||||
server.stop()
|
||||
|
||||
Thread Safety:
|
||||
This class is thread-safe. The broadcast() method can be called
|
||||
from any thread, and client connections are managed safely via
|
||||
asyncio running in a background thread.
|
||||
"""
|
||||
|
||||
# Default NMEA TCP port (standard)
|
||||
DEFAULT_PORT = 10110
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
sensor_data: SensorData,
|
||||
port: int = DEFAULT_PORT,
|
||||
generator: Optional[NMEAGenerator] = None,
|
||||
on_client_connect: Optional[Callable[[str, int], None]] = None,
|
||||
on_client_disconnect: Optional[Callable[[str, int], None]] = None,
|
||||
):
|
||||
"""Initialize the NMEA TCP server.
|
||||
|
||||
Args:
|
||||
sensor_data: SensorData instance to read values from
|
||||
port: TCP port to listen on (default: 10110)
|
||||
generator: NMEAGenerator instance (creates default if None)
|
||||
on_client_connect: Callback when client connects (addr, port)
|
||||
on_client_disconnect: Callback when client disconnects (addr, port)
|
||||
"""
|
||||
self._sensor_data = sensor_data
|
||||
self._port = port
|
||||
self._generator = generator or NMEAGenerator()
|
||||
|
||||
self._on_client_connect = on_client_connect
|
||||
self._on_client_disconnect = on_client_disconnect
|
||||
|
||||
self._running = False
|
||||
self._loop: Optional[asyncio.AbstractEventLoop] = None
|
||||
self._thread: Optional[threading.Thread] = None
|
||||
self._server: Optional[asyncio.Server] = None
|
||||
|
||||
# Client tracking (accessed from asyncio thread)
|
||||
self._clients: Set[asyncio.StreamWriter] = set()
|
||||
self._client_addrs: dict = {} # writer -> (addr, port)
|
||||
|
||||
# Thread-safe counter for client count
|
||||
self._client_count = 0
|
||||
self._client_count_lock = threading.Lock()
|
||||
|
||||
@property
|
||||
def port(self) -> int:
|
||||
"""Get the TCP port."""
|
||||
return self._port
|
||||
|
||||
@property
|
||||
def client_count(self) -> int:
|
||||
"""Get the number of connected clients."""
|
||||
with self._client_count_lock:
|
||||
return self._client_count
|
||||
|
||||
@property
|
||||
def is_running(self) -> bool:
|
||||
"""Check if server is running."""
|
||||
return self._running
|
||||
|
||||
def start(self) -> bool:
|
||||
"""Start the TCP server.
|
||||
|
||||
Returns:
|
||||
True if server started successfully, False otherwise
|
||||
"""
|
||||
if self._running:
|
||||
logger.warning("NMEATcpServer already running")
|
||||
return True
|
||||
|
||||
# Create and start the asyncio event loop in a background thread
|
||||
self._running = True
|
||||
|
||||
# Use an event to wait for server to be ready
|
||||
ready_event = threading.Event()
|
||||
startup_error = [None] # Use list to allow modification in nested function
|
||||
|
||||
def run_loop():
|
||||
try:
|
||||
self._loop = asyncio.new_event_loop()
|
||||
asyncio.set_event_loop(self._loop)
|
||||
|
||||
# Start the server
|
||||
self._loop.run_until_complete(self._start_server(ready_event, startup_error))
|
||||
|
||||
# Run until stopped
|
||||
if self._running and not startup_error[0]:
|
||||
self._loop.run_forever()
|
||||
except Exception as e:
|
||||
startup_error[0] = e
|
||||
ready_event.set()
|
||||
finally:
|
||||
# Cleanup
|
||||
if self._loop:
|
||||
try:
|
||||
self._loop.run_until_complete(self._cleanup())
|
||||
except Exception:
|
||||
pass
|
||||
self._loop.close()
|
||||
self._loop = None
|
||||
|
||||
self._thread = threading.Thread(
|
||||
target=run_loop,
|
||||
daemon=True,
|
||||
name="NMEATcpServer-AsyncIO"
|
||||
)
|
||||
self._thread.start()
|
||||
|
||||
# Wait for server to be ready (with timeout)
|
||||
if not ready_event.wait(timeout=5.0):
|
||||
logger.error("Timeout waiting for NMEA TCP server to start")
|
||||
self._running = False
|
||||
return False
|
||||
|
||||
if startup_error[0]:
|
||||
logger.error(f"Failed to start NMEA TCP server: {startup_error[0]}")
|
||||
self._running = False
|
||||
return False
|
||||
|
||||
logger.info(f"NMEA TCP server listening on port {self._port}")
|
||||
return True
|
||||
|
||||
async def _start_server(self, ready_event: threading.Event, startup_error: list) -> None:
|
||||
"""Start the asyncio TCP server."""
|
||||
try:
|
||||
self._server = await asyncio.start_server(
|
||||
self._handle_client,
|
||||
'',
|
||||
self._port,
|
||||
reuse_address=True,
|
||||
)
|
||||
ready_event.set()
|
||||
except OSError as e:
|
||||
startup_error[0] = e
|
||||
ready_event.set()
|
||||
|
||||
async def _handle_client(
|
||||
self,
|
||||
reader: asyncio.StreamReader,
|
||||
writer: asyncio.StreamWriter
|
||||
) -> None:
|
||||
"""Handle a client connection."""
|
||||
addr = writer.get_extra_info('peername')
|
||||
addr_tuple = (addr[0], addr[1]) if addr else ('unknown', 0)
|
||||
|
||||
# Configure socket for optimal NMEA streaming
|
||||
sock = writer.get_extra_info('socket')
|
||||
if sock:
|
||||
self._configure_socket(sock)
|
||||
|
||||
# Track client
|
||||
self._clients.add(writer)
|
||||
self._client_addrs[writer] = addr_tuple
|
||||
with self._client_count_lock:
|
||||
self._client_count += 1
|
||||
|
||||
logger.info(f"NMEA TCP client connected: {addr_tuple[0]}:{addr_tuple[1]}")
|
||||
|
||||
# Callback
|
||||
if self._on_client_connect:
|
||||
try:
|
||||
self._on_client_connect(addr_tuple[0], addr_tuple[1])
|
||||
except Exception as e:
|
||||
logger.debug(f"Client connect callback error: {e}")
|
||||
|
||||
try:
|
||||
# Keep connection alive until client disconnects or server stops
|
||||
while self._running:
|
||||
try:
|
||||
# Check if client is still connected
|
||||
data = await asyncio.wait_for(reader.read(1), timeout=5.0)
|
||||
if not data:
|
||||
# Client disconnected cleanly
|
||||
break
|
||||
except asyncio.TimeoutError:
|
||||
# No data, but connection still alive
|
||||
continue
|
||||
except (ConnectionResetError, BrokenPipeError):
|
||||
break
|
||||
except Exception as e:
|
||||
logger.debug(f"Client {addr_tuple[0]}:{addr_tuple[1]} error: {e}")
|
||||
finally:
|
||||
await self._remove_client(writer)
|
||||
|
||||
def _configure_socket(self, sock: socket.socket) -> None:
|
||||
"""Configure client socket for optimal NMEA streaming."""
|
||||
# Enable TCP keepalive to detect dead connections
|
||||
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
|
||||
|
||||
# Platform-specific keepalive settings (Linux)
|
||||
if hasattr(socket, 'TCP_KEEPIDLE'):
|
||||
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 10)
|
||||
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 5)
|
||||
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 3)
|
||||
|
||||
# Disable Nagle's algorithm for lower latency
|
||||
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
|
||||
|
||||
async def _remove_client(self, writer: asyncio.StreamWriter) -> None:
|
||||
"""Remove a client from tracking."""
|
||||
if writer not in self._clients:
|
||||
return
|
||||
|
||||
self._clients.discard(writer)
|
||||
addr = self._client_addrs.pop(writer, None)
|
||||
|
||||
with self._client_count_lock:
|
||||
self._client_count = max(0, self._client_count - 1)
|
||||
|
||||
try:
|
||||
writer.close()
|
||||
await writer.wait_closed()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
if addr:
|
||||
logger.info(f"NMEA TCP client disconnected: {addr[0]}:{addr[1]}")
|
||||
|
||||
# Callback
|
||||
if self._on_client_disconnect:
|
||||
try:
|
||||
self._on_client_disconnect(addr[0], addr[1])
|
||||
except Exception as e:
|
||||
logger.debug(f"Client disconnect callback error: {e}")
|
||||
|
||||
async def _cleanup(self) -> None:
|
||||
"""Clean up server resources."""
|
||||
# Close server
|
||||
if self._server:
|
||||
self._server.close()
|
||||
await self._server.wait_closed()
|
||||
self._server = None
|
||||
|
||||
# Close all clients
|
||||
for writer in list(self._clients):
|
||||
await self._remove_client(writer)
|
||||
|
||||
def stop(self) -> None:
|
||||
"""Stop the TCP server and disconnect all clients."""
|
||||
if not self._running:
|
||||
return
|
||||
|
||||
self._running = False
|
||||
|
||||
# Stop the asyncio event loop
|
||||
if self._loop and self._loop.is_running():
|
||||
self._loop.call_soon_threadsafe(self._loop.stop)
|
||||
|
||||
# Wait for thread to finish
|
||||
if self._thread and self._thread.is_alive():
|
||||
self._thread.join(timeout=5.0)
|
||||
self._thread = None
|
||||
|
||||
logger.info("NMEA TCP server stopped")
|
||||
|
||||
def broadcast(self) -> int:
|
||||
"""Generate and broadcast NMEA sentences to all connected clients.
|
||||
|
||||
This method generates all NMEA sentences from the current sensor data
|
||||
and sends them to all connected clients. Each client is sent data
|
||||
independently, so a slow client won't block others.
|
||||
|
||||
Returns:
|
||||
            Number of sentences broadcast (0 if no clients or no data)
        """
        if not self._running or not self._loop:
            return 0

        # Check if we have clients
        if self.client_count == 0:
            return 0

        # Generate NMEA sentences
        sentences = self._generator.generate_all(self._sensor_data)
        if not sentences:
            return 0

        # Encode data
        data = ''.join(sentences).encode('ascii')

        # Schedule broadcast on the asyncio event loop
        try:
            asyncio.run_coroutine_threadsafe(
                self._broadcast_async(data),
                self._loop
            )
        except RuntimeError:
            # Loop not running
            return 0

        return len(sentences)

    async def _broadcast_async(self, data: bytes) -> None:
        """Broadcast data to all clients asynchronously."""
        if not self._clients:
            return

        # Send to all clients concurrently with timeout
        async def send_to_client(writer: asyncio.StreamWriter) -> bool:
            """Send data to a single client. Returns False if client is dead."""
            try:
                writer.write(data)
                # Use wait_for to timeout slow clients
                await asyncio.wait_for(writer.drain(), timeout=2.0)
                return True
            except (asyncio.TimeoutError, ConnectionResetError, BrokenPipeError, OSError):
                return False
            except Exception as e:
                logger.debug(f"Send error: {e}")
                return False

        # Create tasks for all clients
        clients = list(self._clients)
        results = await asyncio.gather(
            *[send_to_client(writer) for writer in clients],
            return_exceptions=True
        )

        # Remove dead clients
        for writer, success in zip(clients, results):
            if success is False or isinstance(success, Exception):
                await self._remove_client(writer)

    def get_client_addresses(self) -> List[tuple]:
        """Get list of connected client addresses.

        Returns:
            List of (host, port) tuples for all connected clients
        """
        return list(self._client_addrs.values())
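The thread-to-loop hand-off used in the broadcast path above can be sketched in isolation: a plain thread schedules a coroutine onto an asyncio loop running elsewhere, then waits on the returned concurrent future. Names and the sample sentence below are illustrative, not from the project.

```python
import asyncio
import threading

received = []

async def broadcast(data: bytes) -> None:
    # Stand-in for _broadcast_async: just record what was delivered.
    received.append(data)

# Run a loop in a background thread, as a server would.
loop = asyncio.new_event_loop()
thread = threading.Thread(target=loop.run_forever, daemon=True)
thread.start()

# Schedule from the calling thread; call_soon_threadsafe queues the work
# even if the loop has not started spinning yet.
future = asyncio.run_coroutine_threadsafe(broadcast(b"$IIMWV,045.0,R,12.3,N,A*0C\r\n"), loop)
future.result(timeout=2.0)  # block until the coroutine completes
loop.call_soon_threadsafe(loop.stop)
```

A `RuntimeError` from `run_coroutine_threadsafe` (loop closed) maps to the `except RuntimeError` branch in the code above.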
32
axiom-nmea/raymarine_nmea/protocol/__init__.py
Normal file
@@ -0,0 +1,32 @@
"""
Protocol module for parsing Raymarine LightHouse protobuf format.
"""

from .constants import (
    WIRE_VARINT,
    WIRE_FIXED64,
    WIRE_LENGTH,
    WIRE_FIXED32,
    HEADER_SIZE,
    RAD_TO_DEG,
    MS_TO_KTS,
    FEET_TO_M,
    KELVIN_OFFSET,
)
from .parser import ProtobufParser, ProtoField
from .decoder import RaymarineDecoder

__all__ = [
    "WIRE_VARINT",
    "WIRE_FIXED64",
    "WIRE_LENGTH",
    "WIRE_FIXED32",
    "HEADER_SIZE",
    "RAD_TO_DEG",
    "MS_TO_KTS",
    "FEET_TO_M",
    "KELVIN_OFFSET",
    "ProtobufParser",
    "ProtoField",
    "RaymarineDecoder",
]
391
axiom-nmea/raymarine_nmea/protocol/constants.py
Normal file
@@ -0,0 +1,391 @@
"""
Protocol constants for Raymarine LightHouse decoding.

This module defines the reverse-engineered protobuf schema for Raymarine's
LightHouse network protocol. The protocol uses Google Protocol Buffers over
UDP multicast (226.192.206.102:2565) with a 20-byte proprietary header.

================================================================================
PACKET STRUCTURE
================================================================================

┌──────────────────────────────────────────────────────────────────────────┐
│ Bytes 0-19:  Raymarine Header (20 bytes)                                 │
├──────────────────────────────────────────────────────────────────────────┤
│ Bytes 20+:   Protobuf Payload (variable length)                          │
└──────────────────────────────────────────────────────────────────────────┘

================================================================================
MESSAGE HIERARCHY (Proto-like Schema)
================================================================================

LightHousePacket {
     1: DeviceInfo      device_info      [message]
     2: GpsPosition     gps              [message]   ✓ RELIABLE
     3: HeadingData     heading          [message]   ~ VARIABLE
     5: Navigation      navigation       [message]   ✓ RELIABLE
     7: DepthData       depth            [message]   ~ VARIABLE
    13: WindData        wind             [message]   ~ VARIABLE
    14: EngineData[]    engines          [repeated]  ✓ RELIABLE
    15: EnvironmentData environment      [message]   ✓ RELIABLE
    16: TankData[]      tanks            [repeated]  ✓ RELIABLE
    20: BatteryData[]   house_batteries  [repeated]  ✓ RELIABLE
}

GpsPosition (Field 2) {
    1: double latitude   [fixed64]  // Decimal degrees, -90 to +90
    2: double longitude  [fixed64]  // Decimal degrees, -180 to +180
}

HeadingData (Field 3) {
    1: float cog      [fixed32]  // Course over ground (radians)
    2: float heading  [fixed32]  // Compass heading (radians)
}

Navigation (Field 5) {
    1: float cog  [fixed32]  // Course over ground (radians)
    3: float sog  [fixed32]  // Speed over ground (m/s)
}

DepthData (Field 7) {
    1: float depth  [fixed32]  // Depth in meters
}

WindData (Field 13) {
    4: float twd  [fixed32]  // True wind direction (radians)
    5: float tws  [fixed32]  // True wind speed (m/s)
    6: float aws  [fixed32]  // Apparent wind speed (m/s)
}

EngineData (Field 14, repeated) {
    1: int32 engine_id        [varint]   // 0=Port, 1=Starboard
    3: EngineSensors sensors  [message]
}

EngineSensors (Field 14.3) {
    4: float battery_voltage  [fixed32]  // Volts
}

EnvironmentData (Field 15) {
    1: float pressure    [fixed32]  // Barometric pressure (Pascals)
    3: float air_temp    [fixed32]  // Air temperature (Kelvin)
    9: float water_temp  [fixed32]  // Water temperature (Kelvin)
}

TankData (Field 16, repeated) {
    1: int32 tank_id  [varint]   // Tank identifier (see inference)
    2: int32 status   [varint]   // Tank type/status flag
    3: float level    [fixed32]  // Fill percentage (0-100)
}

BatteryData (Field 20, repeated) {
    1: int32 battery_id  [varint]   // Battery identifier
    3: float voltage     [fixed32]  // Volts
}

================================================================================
INFERENCE RULES (Field-Presence Logic)
================================================================================

The protocol does NOT include explicit message type identifiers. Instead,
the presence or absence of fields determines meaning:

TANK ID INFERENCE (Field 16):
┌─────────────────────────────────────────────────────────────────────────┐
│ If tank_id (16.1) is ABSENT:                                            │
│   - If status == 5 (WASTE)  → tank_id = 100 (Black/Gray Water)          │
│   - If status is ABSENT     → tank_id = 2   (Port Fuel)                 │
│                                                                         │
│ Port Fuel is the ONLY tank that transmits with neither ID nor status.   │
└─────────────────────────────────────────────────────────────────────────┘

ENGINE ID INFERENCE (Field 14):
┌─────────────────────────────────────────────────────────────────────────┐
│ If engine_id (14.1) is ABSENT:                                          │
│   - Default to engine_id = 0 (Port Engine)                              │
│                                                                         │
│ Starboard engine explicitly sends engine_id = 1.                        │
└─────────────────────────────────────────────────────────────────────────┘

================================================================================
WIRE TYPES (Protobuf Encoding)
================================================================================

Protobuf encodes each field with a tag: (field_number << 3) | wire_type

Type 0 (VARINT):  Variable-length integers (IDs, counts, enums)
Type 1 (FIXED64): 8-byte values (doubles for GPS coordinates)
Type 2 (LENGTH):  Length-delimited (nested messages, strings)
Type 5 (FIXED32): 4-byte values (floats for angles, speeds, voltages)

================================================================================
UNIT CONVENTIONS
================================================================================

Angles:      Radians (0 to 2π)  → Convert with RAD_TO_DEG
Speed:       Meters/second      → Convert with MS_TO_KTS
Temperature: Kelvin             → Subtract KELVIN_OFFSET for Celsius
Pressure:    Pascals            → Multiply by PA_TO_MBAR for millibars
Depth:       Meters             → Divide by FEET_TO_M for feet
Voltage:     Volts (direct)
Tank Level:  Percentage (0-100)

================================================================================
"""

# ==============================================================================
# WIRE TYPES
# ==============================================================================

WIRE_VARINT = 0   # Variable-length integers (IDs, counts, enums)
WIRE_FIXED64 = 1  # 8-byte values (GPS coordinates as doubles)
WIRE_LENGTH = 2   # Length-delimited (nested messages, strings)
WIRE_FIXED32 = 5  # 4-byte values (angles, speeds, voltages as floats)

# Raymarine packet header size (bytes before protobuf payload)
HEADER_SIZE = 20

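The tag layout the docstring describes — `(field_number << 3) | wire_type` — can be checked with a few lines. This is our own illustration, not project code; `split_tag` is a hypothetical helper.

```python
# Wire-type values as defined in constants.py above.
WIRE_VARINT = 0
WIRE_FIXED64 = 1
WIRE_LENGTH = 2
WIRE_FIXED32 = 5

def split_tag(tag: int) -> tuple:
    """Return (field_number, wire_type) for a single-byte protobuf tag."""
    return tag >> 3, tag & 0x07

# 0x12 = field 2 (GpsPosition), length-delimited nested message
field_num, wire = split_tag(0x12)
# 0x09 = field 1 (latitude inside GpsPosition), fixed64 double
lat_field, lat_wire = split_tag(0x09)
```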
# ==============================================================================
# TOP-LEVEL FIELDS
# ==============================================================================

class Fields:
    """Top-level protobuf field numbers in LightHousePacket.

    All top-level fields are length-delimited messages (wire type 2).
    Fields marked [repeated] appear multiple times for multiple instances.
    """
    DEVICE_INFO = 1       # DeviceInfo - Device name and serial
    GPS_POSITION = 2      # GpsPosition - Latitude/longitude (reliable)
    HEADING = 3           # HeadingData - Compass heading (variable)
    SOG_COG = 5           # Navigation - Speed/course over ground (reliable)
    DEPTH = 7             # DepthData - Water depth (variable, large packets)
    WIND_NAVIGATION = 13  # WindData - Wind speed/direction (variable)
    ENGINE_DATA = 14      # EngineData[] - [repeated] Engine sensors + battery
    TEMPERATURE = 15      # EnvironmentData - Temp/pressure (reliable)
    TANK_DATA = 16        # TankData[] - [repeated] Tank levels (reliable)
    HOUSE_BATTERY = 20    # BatteryData[] - [repeated] House batteries (reliable)

# ==============================================================================
# NESTED FIELD DEFINITIONS
# ==============================================================================

class GPSFields:
    """Field 2: GpsPosition - GPS coordinates.

    Reliability: ✓ HIGH
    Wire types: All fields are fixed64 (8-byte doubles)

    Example values:
        latitude: 26.123456 (decimal degrees)
        longitude: -80.654321 (decimal degrees)
    """
    LATITUDE = 1   # fixed64/double - Decimal degrees, -90 to +90
    LONGITUDE = 2  # fixed64/double - Decimal degrees, -180 to +180
    COG_RAD = 3    # fixed64/double - Course over ground (radians), NaN when stationary
    SOG_MS = 4     # fixed32/float - Speed over ground (m/s)


class HeadingFields:
    """Field 3: HeadingData - Compass and course data.

    Reliability: ~ VARIABLE (context-dependent)
    Wire types: All fields are fixed32 (4-byte floats)
    """
    COG_RAD = 1      # fixed32/float - Course over ground (radians)
    HEADING_RAD = 2  # fixed32/float - Compass heading (radians, 0 to 2π)


class SOGCOGFields:
    """Field 5: Navigation - GPS-derived speed and course.

    Reliability: ✓ HIGH
    Wire types: All fields are fixed32 (4-byte floats)
    Packet size: Found in 86-92 byte packets

    This is the PRIMARY source for SOG/COG data.
    """
    COG_RAD = 1  # fixed32/float - Course over ground (radians)
    # Field 2 exists but purpose unknown
    SOG_MS = 3   # fixed32/float - Speed over ground (m/s)


class DepthFields:
    """Field 7: DepthData - Water depth.

    Reliability: ~ VARIABLE (only in larger packets)
    Wire types: fixed32 (4-byte float)
    """
    DEPTH_METERS = 1  # fixed32/float - Depth in meters


class WindFields:
    """Field 13: WindData - Wind speed and direction.

    Reliability: ~ VARIABLE (depends on wind sensor availability)
    Wire types: All fields are fixed32 (4-byte floats)

    Note: Fields 1-3 exist but purpose unknown.
    """
    # Fields 1-3 unknown
    TRUE_WIND_DIRECTION = 4  # fixed32/float - TWD in radians
    TRUE_WIND_SPEED = 5      # fixed32/float - TWS in m/s
    APPARENT_WIND_SPEED = 6  # fixed32/float - AWS in m/s


class TemperatureFields:
    """Field 15: EnvironmentData - Temperature and pressure sensors.

    Reliability: ✓ HIGH
    Wire types: All fields are fixed32 (4-byte floats)

    Note: Fields 2, 4-8 exist but purpose unknown.
    """
    BAROMETRIC_PRESSURE = 1  # fixed32/float - Pascals (divide by 100 for mbar)
    # Field 2 unknown
    AIR_TEMP = 3             # fixed32/float - Kelvin (subtract 273.15 for °C)
    # Fields 4-8 unknown
    WATER_TEMP = 9           # fixed32/float - Kelvin (subtract 273.15 for °C)


class TankFields:
    """Field 16: TankData - Tank level sensors (repeated message).

    Reliability: ✓ HIGH
    Wire types: Mixed (see individual fields)

    INFERENCE RULE:
        If tank_id is ABSENT:
            - status == 5 (WASTE) → tank_id = 100 (Black/Gray Water)
            - status is ABSENT    → tank_id = 2   (Port Fuel, unique case)
    """
    TANK_ID = 1    # varint/int32 - Tank identifier (may be absent, see inference)
    STATUS = 2     # varint/int32 - Tank type flag (5 = waste tank)
    LEVEL_PCT = 3  # fixed32/float - Fill percentage (0-100)


class BatteryFields:
    """Field 20: BatteryData - House battery sensors (repeated message).

    Reliability: ✓ HIGH
    Wire types: Mixed (see individual fields)

    Known battery IDs: 11, 13 (house batteries)
    """
    BATTERY_ID = 1  # varint/int32 - Battery identifier
    # Field 2 unknown
    VOLTAGE = 3     # fixed32/float - Voltage in volts


class EngineFields:
    """Field 14: EngineData - Engine sensors (repeated message).

    Reliability: ✓ HIGH
    Wire types: Mixed (see individual fields)

    Structure: 3 levels of nesting
        Field 14 (EngineData)
        └─ Field 14.3 (EngineSensors message)
           └─ Field 14.3.4 (battery_voltage float)

    INFERENCE RULE:
        If engine_id is ABSENT → default to 0 (Port Engine)
        Starboard engine explicitly sends engine_id = 1

    Battery IDs: Stored as 1000 + engine_id (1000=Port, 1001=Starboard)
    """
    ENGINE_ID = 1    # varint/int32 - Engine ID (0=Port, 1=Starboard)
    # Field 2 unknown
    SENSOR_DATA = 3  # message - EngineSensors (nested message)

    # Within SENSOR_DATA (Field 14.3):
    # Fields 1-3 unknown
    BATTERY_VOLTAGE = 4  # fixed32/float - Battery voltage in volts

# ==============================================================================
# UNIT CONVERSIONS
# ==============================================================================
# Raymarine uses SI units internally. These convert to common marine units.

RAD_TO_DEG = 57.2957795131  # radians → degrees (180/π)
MS_TO_KTS = 1.94384449      # m/s → knots
FEET_TO_M = 0.3048          # feet → meters
KELVIN_OFFSET = 273.15      # Kelvin → Celsius (subtract)
FATHOMS_TO_M = 1.8288       # fathoms → meters
PA_TO_MBAR = 0.01           # Pascals → millibars (hPa)

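Applied to one raw SI sample, the conversion constants above work out as follows. The input values are illustrative, not taken from a real packet.

```python
import math

# Constants as defined in constants.py above.
RAD_TO_DEG = 57.2957795131
MS_TO_KTS = 1.94384449
KELVIN_OFFSET = 273.15
PA_TO_MBAR = 0.01

# Illustrative raw SI readings -> common marine units.
heading_deg = (math.pi / 2 * RAD_TO_DEG) % 360  # 1.5708 rad -> ~90°
sog_kts = 5.0 * MS_TO_KTS                       # 5 m/s -> ~9.72 kts
air_temp_c = 300.15 - KELVIN_OFFSET             # 300.15 K -> ~27°C
pressure_mbar = 101325.0 * PA_TO_MBAR           # standard atmosphere -> ~1013.25 mbar
```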
# ==============================================================================
# VALIDATION RANGES
# ==============================================================================
# Values outside these ranges are rejected as invalid/corrupt data.

class ValidationRanges:
    """Valid ranges for decoded sensor values.

    These ranges filter out corrupt or invalid data. Values outside
    these bounds are discarded during decoding.
    """

    # -------------------------------------------------------------------------
    # GPS Position
    # -------------------------------------------------------------------------
    LATITUDE_MIN = -90.0
    LATITUDE_MAX = 90.0
    LONGITUDE_MIN = -180.0
    LONGITUDE_MAX = 180.0
    NULL_ISLAND_THRESHOLD = 0.1  # Reject positions within 0.1° of (0,0)

    # -------------------------------------------------------------------------
    # Angles (radians) - Heading, COG, Wind Direction
    # -------------------------------------------------------------------------
    ANGLE_MIN = 0.0
    ANGLE_MAX = 6.5  # Slightly > 2π (6.283) to allow small errors

    # -------------------------------------------------------------------------
    # Speed (m/s) - SOG, Wind Speed
    # -------------------------------------------------------------------------
    SPEED_MIN = 0.0
    SPEED_MAX = 100.0  # ~194 knots, far above any realistic vessel speed

    # -------------------------------------------------------------------------
    # Depth (meters)
    # -------------------------------------------------------------------------
    DEPTH_MIN = 0.0
    DEPTH_MAX = 1000.0  # ~3280 feet / ~546 fathoms

    # -------------------------------------------------------------------------
    # Temperature (Kelvin)
    # -------------------------------------------------------------------------
    AIR_TEMP_MIN = 200.0    # -73°C (extreme cold)
    AIR_TEMP_MAX = 350.0    # +77°C (extreme heat)
    WATER_TEMP_MIN = 270.0  # -3°C (near freezing)
    WATER_TEMP_MAX = 320.0  # +47°C (tropical/engine room)

    # -------------------------------------------------------------------------
    # Tank Level (percentage)
    # -------------------------------------------------------------------------
    TANK_MIN = 0.0
    TANK_MAX = 100.0

    # -------------------------------------------------------------------------
    # Battery Voltage
    # -------------------------------------------------------------------------
    VOLTAGE_MIN = 10.0  # Below 10V = dead/disconnected
    VOLTAGE_MAX = 60.0  # Covers 12V, 24V, 48V systems with headroom

    # -------------------------------------------------------------------------
    # Barometric Pressure (Pascals)
    # -------------------------------------------------------------------------
    PRESSURE_MIN = 87000.0   # 870 mbar (record low, Typhoon Tip)
    PRESSURE_MAX = 108400.0  # 1084 mbar (record high, Siberia)


# ==============================================================================
# PROTOCOL CONSTANTS
# ==============================================================================

# Tank status values (Field 16.2)
TANK_STATUS_WASTE = 5  # Black/gray water tanks transmit status=5
516
axiom-nmea/raymarine_nmea/protocol/decoder.py
Normal file
@@ -0,0 +1,516 @@
"""
Raymarine packet decoder.

Decodes Raymarine LightHouse protobuf packets and extracts sensor data
into structured Python objects.
"""

from typing import Dict, List, Optional, Any
from dataclasses import dataclass, field as dc_field
import time

from .parser import ProtobufParser, ProtoField
from .constants import (
    WIRE_VARINT,
    WIRE_FIXED64,
    WIRE_LENGTH,
    WIRE_FIXED32,
    HEADER_SIZE,
    RAD_TO_DEG,
    MS_TO_KTS,
    FEET_TO_M,
    KELVIN_OFFSET,
    PA_TO_MBAR,
    Fields,
    GPSFields,
    HeadingFields,
    SOGCOGFields,
    DepthFields,
    WindFields,
    TemperatureFields,
    TankFields,
    BatteryFields,
    EngineFields,
    ValidationRanges,
    TANK_STATUS_WASTE,
)

@dataclass
class DecodedData:
    """Container for decoded sensor values from a single packet.

    This is a lightweight container for data extracted from one packet.
    For aggregated data across multiple packets, use SensorData.
    """
    # Position
    latitude: Optional[float] = None
    longitude: Optional[float] = None

    # Navigation
    heading_deg: Optional[float] = None
    cog_deg: Optional[float] = None
    sog_kts: Optional[float] = None

    # Wind
    twd_deg: Optional[float] = None  # True Wind Direction
    tws_kts: Optional[float] = None  # True Wind Speed
    awa_deg: Optional[float] = None  # Apparent Wind Angle
    aws_kts: Optional[float] = None  # Apparent Wind Speed

    # Depth
    depth_m: Optional[float] = None

    # Temperature
    water_temp_c: Optional[float] = None
    air_temp_c: Optional[float] = None

    # Barometric pressure
    pressure_mbar: Optional[float] = None

    # Tanks: dict of tank_id -> level percentage
    tanks: Dict[int, float] = dc_field(default_factory=dict)

    # Batteries: dict of battery_id -> voltage
    batteries: Dict[int, float] = dc_field(default_factory=dict)

    # Decode timestamp
    timestamp: float = dc_field(default_factory=time.time)

    def has_data(self) -> bool:
        """Check if any data was decoded."""
        return (
            self.latitude is not None or
            self.longitude is not None or
            self.heading_deg is not None or
            self.cog_deg is not None or
            self.sog_kts is not None or
            self.twd_deg is not None or
            self.tws_kts is not None or
            self.depth_m is not None or
            self.water_temp_c is not None or
            self.air_temp_c is not None or
            self.pressure_mbar is not None or
            bool(self.tanks) or
            bool(self.batteries)
        )

class RaymarineDecoder:
    """Decodes Raymarine packets using proper protobuf parsing.

    This decoder implements field-based parsing of Raymarine's protobuf
    protocol. It extracts all supported sensor types and validates values.

    Example:
        decoder = RaymarineDecoder()
        result = decoder.decode(packet_bytes)
        if result.latitude:
            print(f"GPS: {result.latitude}, {result.longitude}")
    """

    def __init__(self, verbose: bool = False):
        """Initialize the decoder.

        Args:
            verbose: If True, print decoded values
        """
        self.verbose = verbose

    def decode(self, packet: bytes) -> DecodedData:
        """Decode a single Raymarine packet.

        Args:
            packet: Raw packet bytes (including 20-byte header)

        Returns:
            DecodedData containing all extracted values
        """
        result = DecodedData()

        # Need at least header + some protobuf data
        if len(packet) < HEADER_SIZE + 20:
            return result

        # Skip fixed header, protobuf starts at offset 0x14
        proto_data = packet[HEADER_SIZE:]

        # Parse protobuf - collect repeated fields:
        #   Field 14 = Engine data (contains battery voltage at 14.3.4)
        #   Field 16 = Tank data
        #   Field 20 = House battery data
        parser = ProtobufParser(proto_data)
        fields = parser.parse_message(collect_repeated={14, 16, 20})

        if not fields:
            return result

        # Extract GPS from Field 2
        if Fields.GPS_POSITION in fields:
            gps_field = fields[Fields.GPS_POSITION]
            if gps_field.children:
                self._extract_gps(gps_field.children, result)

        # Extract Heading from Field 3
        if Fields.HEADING in fields:
            heading_field = fields[Fields.HEADING]
            if heading_field.children:
                self._extract_heading(heading_field.children, result)

        # Extract SOG/COG from Field 5 (primary source for SOG/COG)
        if Fields.SOG_COG in fields:
            sog_cog_field = fields[Fields.SOG_COG]
            if sog_cog_field.children:
                self._extract_sog_cog(sog_cog_field.children, result)

        # Extract Wind from Field 13
        if Fields.WIND_NAVIGATION in fields:
            wind_field = fields[Fields.WIND_NAVIGATION]
            if wind_field.children:
                self._extract_wind(wind_field.children, result)

        # Extract Depth from Field 7 (only in larger packets)
        if Fields.DEPTH in fields:
            depth_field = fields[Fields.DEPTH]
            if depth_field.children:
                self._extract_depth(depth_field.children, result)

        # Extract Temperature from Field 15
        if Fields.TEMPERATURE in fields:
            temp_field = fields[Fields.TEMPERATURE]
            if temp_field.children:
                self._extract_temperature(temp_field.children, result)

        # Extract Tank data from Field 16 (repeated)
        if Fields.TANK_DATA in fields:
            tank_fields = fields[Fields.TANK_DATA]  # This is a list
            self._extract_tanks(tank_fields, result)

        # Extract Battery data from Field 20 (repeated) - house batteries
        if Fields.HOUSE_BATTERY in fields:
            battery_fields = fields[Fields.HOUSE_BATTERY]  # This is a list
            self._extract_batteries(battery_fields, result)

        # Extract Engine battery data from Field 14 (repeated)
        if Fields.ENGINE_DATA in fields:
            engine_fields = fields[Fields.ENGINE_DATA]  # This is a list
            self._extract_engine_batteries(engine_fields, result)

        return result

    def _extract_gps(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract GPS position from Field 2's children."""
        lat = None
        lon = None

        # Field 1 = Latitude
        if GPSFields.LATITUDE in fields:
            f = fields[GPSFields.LATITUDE]
            if f.wire_type == WIRE_FIXED64:
                lat = ProtobufParser.decode_double(f.value)

        # Field 2 = Longitude
        if GPSFields.LONGITUDE in fields:
            f = fields[GPSFields.LONGITUDE]
            if f.wire_type == WIRE_FIXED64:
                lon = ProtobufParser.decode_double(f.value)

        # Validate lat/lon
        if lat is not None and lon is not None:
            if (ValidationRanges.LATITUDE_MIN <= lat <= ValidationRanges.LATITUDE_MAX and
                    ValidationRanges.LONGITUDE_MIN <= lon <= ValidationRanges.LONGITUDE_MAX):
                # Check not at null island
                if (abs(lat) > ValidationRanges.NULL_ISLAND_THRESHOLD or
                        abs(lon) > ValidationRanges.NULL_ISLAND_THRESHOLD):
                    result.latitude = lat
                    result.longitude = lon
                    if self.verbose:
                        print(f"GPS: {lat:.6f}, {lon:.6f}")

    def _extract_heading(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract heading from Field 3's children."""
        # Field 2 = Heading in radians
        if HeadingFields.HEADING_RAD in fields:
            f = fields[HeadingFields.HEADING_RAD]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.ANGLE_MIN <= val <= ValidationRanges.ANGLE_MAX:
                        heading_deg = (val * RAD_TO_DEG) % 360
                        result.heading_deg = heading_deg
                        if self.verbose:
                            print(f"Heading: {heading_deg:.1f}°")

    def _extract_sog_cog(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract SOG and COG from Field 5's children.

        Field 5 contains GPS-derived navigation data.
        Field 5.1 = COG (shows most variation in real data)
        Field 5.3 = SOG (confirmed with real data)
        """
        # Field 5.1 = COG (Course Over Ground) in radians - confirmed with real data
        if SOGCOGFields.COG_RAD in fields:
            f = fields[SOGCOGFields.COG_RAD]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.ANGLE_MIN <= val <= ValidationRanges.ANGLE_MAX:
                        cog_deg = (val * RAD_TO_DEG) % 360
                        result.cog_deg = cog_deg
                        if self.verbose:
                            print(f"COG: {cog_deg:.1f}°")

        # Field 5.3 = SOG (Speed Over Ground) in m/s - confirmed with real data
        if SOGCOGFields.SOG_MS in fields:
            f = fields[SOGCOGFields.SOG_MS]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.SPEED_MIN <= val <= ValidationRanges.SPEED_MAX:
                        sog_kts = val * MS_TO_KTS
                        result.sog_kts = sog_kts
                        if self.verbose:
                            print(f"SOG: {sog_kts:.1f} kts")

    def _extract_wind(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract wind data from Field 13's children."""
        # Field 4 = True Wind Direction (radians)
        if WindFields.TRUE_WIND_DIRECTION in fields:
            f = fields[WindFields.TRUE_WIND_DIRECTION]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.ANGLE_MIN <= val <= ValidationRanges.ANGLE_MAX:
                        twd_deg = (val * RAD_TO_DEG) % 360
                        result.twd_deg = twd_deg
                        if self.verbose:
                            print(f"TWD: {twd_deg:.1f}°")

        # Field 5 = True Wind Speed (m/s)
        if WindFields.TRUE_WIND_SPEED in fields:
            f = fields[WindFields.TRUE_WIND_SPEED]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.SPEED_MIN <= val <= ValidationRanges.SPEED_MAX:
                        tws_kts = val * MS_TO_KTS
                        result.tws_kts = tws_kts
                        if self.verbose:
                            print(f"TWS: {tws_kts:.1f} kts")

        # Field 6 = Apparent Wind Speed (m/s)
        if WindFields.APPARENT_WIND_SPEED in fields:
            f = fields[WindFields.APPARENT_WIND_SPEED]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.SPEED_MIN <= val <= ValidationRanges.SPEED_MAX:
                        aws_kts = val * MS_TO_KTS
                        result.aws_kts = aws_kts
                        if self.verbose:
                            print(f"AWS: {aws_kts:.1f} kts")

    def _extract_depth(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract depth from Field 7's children."""
        if DepthFields.DEPTH_METERS in fields:
            f = fields[DepthFields.DEPTH_METERS]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.DEPTH_MIN < val <= ValidationRanges.DEPTH_MAX:
                        result.depth_m = val
                        if self.verbose:
                            depth_ft = val / FEET_TO_M
                            print(f"Depth: {depth_ft:.1f} ft ({val:.2f} m)")

    def _extract_temperature(
        self,
        fields: Dict[int, ProtoField],
        result: DecodedData
    ) -> None:
        """Extract temperature and pressure from Field 15's children."""
        # Field 1 = Barometric Pressure (Pascals)
        if TemperatureFields.BAROMETRIC_PRESSURE in fields:
            f = fields[TemperatureFields.BAROMETRIC_PRESSURE]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.PRESSURE_MIN <= val <= ValidationRanges.PRESSURE_MAX:
                        pressure_mbar = val * PA_TO_MBAR
                        result.pressure_mbar = pressure_mbar
                        if self.verbose:
                            print(f"Pressure: {pressure_mbar:.1f} mbar")

        # Field 3 = Air Temperature (Kelvin)
        if TemperatureFields.AIR_TEMP in fields:
            f = fields[TemperatureFields.AIR_TEMP]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.AIR_TEMP_MIN <= val <= ValidationRanges.AIR_TEMP_MAX:
                        temp_c = val - KELVIN_OFFSET
                        result.air_temp_c = temp_c
                        if self.verbose:
                            print(f"Air Temp: {temp_c:.1f}°C")

        # Field 9 = Water Temperature (Kelvin)
        if TemperatureFields.WATER_TEMP in fields:
            f = fields[TemperatureFields.WATER_TEMP]
            if f.wire_type == WIRE_FIXED32:
                val = ProtobufParser.decode_float(f.value)
                if val is not None:
                    if ValidationRanges.WATER_TEMP_MIN <= val <= ValidationRanges.WATER_TEMP_MAX:
                        temp_c = val - KELVIN_OFFSET
                        result.water_temp_c = temp_c
                        if self.verbose:
                            print(f"Water Temp: {temp_c:.1f}°C")

    def _extract_tanks(
        self,
        tank_fields: List[ProtoField],
        result: DecodedData
    ) -> None:
        """Extract tank levels from Field 16 (repeated)."""
        for tank_field in tank_fields:
            if not tank_field.children:
                continue

            children = tank_field.children
            tank_id = None
            level = None
            status = None

            # Field 1 = Tank ID (varint)
            if TankFields.TANK_ID in children:
                f = children[TankFields.TANK_ID]
                if f.wire_type == WIRE_VARINT:
                    tank_id = f.value

            # Field 2 = Status (varint)
            if TankFields.STATUS in children:
                f = children[TankFields.STATUS]
                if f.wire_type == WIRE_VARINT:
                    status = f.value

            # Field 3 = Tank Level percentage (float)
            if TankFields.LEVEL_PCT in children:
                f = children[TankFields.LEVEL_PCT]
                if f.wire_type == WIRE_FIXED32:
                    val = ProtobufParser.decode_float(f.value)
                    if val is not None:
                        if ValidationRanges.TANK_MIN <= val <= ValidationRanges.TANK_MAX:
                            level = val

            # If we have a level but no tank_id, try to infer it
            if tank_id is None and level is not None:
                if status == TANK_STATUS_WASTE:
                    # Black/gray water tank
                    tank_id = 100
                elif status is None:
                    # Port Fuel is the ONLY tank with neither ID nor status
                    tank_id = 2

            if tank_id is not None and level is not None:
                result.tanks[tank_id] = level
                if self.verbose:
                    print(f"Tank {tank_id}: {level:.1f}%")

def _extract_batteries(
|
||||
self,
|
||||
battery_fields: List[ProtoField],
|
||||
result: DecodedData
|
||||
) -> None:
|
||||
"""Extract battery voltages from Field 20 (repeated)."""
|
||||
for battery_field in battery_fields:
|
||||
if not battery_field.children:
|
||||
continue
|
||||
|
||||
children = battery_field.children
|
||||
battery_id = None
|
||||
voltage = None
|
||||
|
||||
# Field 1 = Battery ID (varint)
|
||||
if BatteryFields.BATTERY_ID in children:
|
||||
f = children[BatteryFields.BATTERY_ID]
|
||||
if f.wire_type == WIRE_VARINT:
|
||||
battery_id = f.value
|
||||
|
||||
# Field 3 = Voltage (float)
|
||||
if BatteryFields.VOLTAGE in children:
|
||||
f = children[BatteryFields.VOLTAGE]
|
||||
if f.wire_type == WIRE_FIXED32:
|
||||
val = ProtobufParser.decode_float(f.value)
|
||||
if val is not None:
|
||||
if ValidationRanges.VOLTAGE_MIN <= val <= ValidationRanges.VOLTAGE_MAX:
|
||||
voltage = val
|
||||
|
||||
if battery_id is not None and voltage is not None:
|
||||
result.batteries[battery_id] = voltage
|
||||
if self.verbose:
|
||||
print(f"Battery {battery_id}: {voltage:.2f}V")
|
||||
|
||||
def _extract_engine_batteries(
|
||||
self,
|
||||
engine_fields: List[ProtoField],
|
||||
result: DecodedData
|
||||
) -> None:
|
||||
"""Extract engine battery voltages from Field 14 (repeated).
|
||||
|
||||
Engine data structure:
|
||||
Field 14.1 (varint): Engine ID (0=Port, 1=Starboard)
|
||||
Field 14.3 (message): Engine sensor data
|
||||
Field 14.3.4 (float): Battery voltage
|
||||
"""
|
||||
for engine_field in engine_fields:
|
||||
if not engine_field.children:
|
||||
continue
|
||||
|
||||
children = engine_field.children
|
||||
|
||||
# Field 1 = Engine ID (varint), default 0 (Port) if not present
|
||||
engine_id = 0
|
||||
if EngineFields.ENGINE_ID in children:
|
||||
f = children[EngineFields.ENGINE_ID]
|
||||
if f.wire_type == WIRE_VARINT:
|
||||
engine_id = f.value
|
||||
|
||||
# Field 3 = Engine sensor data (nested message)
|
||||
if EngineFields.SENSOR_DATA in children:
|
||||
f = children[EngineFields.SENSOR_DATA]
|
||||
if f.wire_type == WIRE_LENGTH:
|
||||
sensor_data = f.value
|
||||
# Parse the nested message to get Field 4 (voltage)
|
||||
sensor_parser = ProtobufParser(sensor_data)
|
||||
sensor_fields = sensor_parser.parse_message()
|
||||
|
||||
if EngineFields.BATTERY_VOLTAGE in sensor_fields:
|
||||
vf = sensor_fields[EngineFields.BATTERY_VOLTAGE]
|
||||
if vf.wire_type == WIRE_FIXED32:
|
||||
val = ProtobufParser.decode_float(vf.value)
|
||||
if val is not None:
|
||||
if ValidationRanges.VOLTAGE_MIN <= val <= ValidationRanges.VOLTAGE_MAX:
|
||||
# Use battery_id = 1000 + engine_id
|
||||
battery_id = 1000 + engine_id
|
||||
result.batteries[battery_id] = val
|
||||
if self.verbose:
|
||||
print(f"Engine {engine_id} Battery: {val:.2f}V")
|
||||
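The nested Field 14 layout that `_extract_engine_batteries` walks can be sketched with hand-built bytes. This is a standalone illustration of the wire layout only (it does not use the project's classes, and varints are assumed single-byte for brevity):

```python
import struct

# Engine sensor payload: field 4 (fixed32 float) = battery voltage
volts = struct.pack('<f', 25.8)
inner = bytes([(4 << 3) | 5]) + volts          # tag 0x25, then 4 little-endian bytes

# Outer engine message: field 1 (varint) engine ID, field 3 (length-delimited) sensor data
outer = bytes([(1 << 3) | 0, 1])               # engine_id = 1 (Starboard)
outer += bytes([(3 << 3) | 2, len(inner)]) + inner

# Minimal hand decode of the outer message
pos = 0
engine_id = None
voltage = None
while pos < len(outer):
    tag = outer[pos]; pos += 1
    field_num, wire_type = tag >> 3, tag & 0x07
    if wire_type == 0:                          # varint (single byte here)
        value = outer[pos]; pos += 1
        if field_num == 1:
            engine_id = value
    elif wire_type == 2:                        # length-delimited nested message
        length = outer[pos]; pos += 1
        payload = outer[pos:pos + length]; pos += length
        if field_num == 3 and payload[0] == ((4 << 3) | 5):
            voltage = struct.unpack('<f', payload[1:5])[0]

print(engine_id, round(voltage, 1))  # 1 25.8
```

The same bytes fed through `ProtobufParser` would surface as Field 3 children with a fixed32 Field 4, which is exactly what the extractor validates against the voltage range.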
243
axiom-nmea/raymarine_nmea/protocol/parser.py
Normal file
@@ -0,0 +1,243 @@
"""
Protobuf wire format parser for Raymarine packets.

This parser implements the Google Protocol Buffers wire format without
requiring a schema (.proto file). It can parse any protobuf message and
return the field structure.

Wire Format Reference:
    https://developers.google.com/protocol-buffers/docs/encoding
"""

import struct
from dataclasses import dataclass, field
from typing import Any, Dict, Optional, Set

from .constants import (
    WIRE_VARINT,
    WIRE_FIXED64,
    WIRE_LENGTH,
    WIRE_FIXED32,
)


@dataclass
class ProtoField:
    """A decoded protobuf field.

    Attributes:
        field_num: The field number (1-536870911)
        wire_type: The wire type (0, 1, 2, or 5)
        value: The decoded value (int, bytes, or float)
        children: For length-delimited fields, nested message fields
    """
    field_num: int
    wire_type: int
    value: Any
    children: Dict[int, 'ProtoField'] = field(default_factory=dict)


class ProtobufParser:
    """Parses protobuf wire format without a schema.

    This parser reads raw protobuf data and extracts fields based on their
    wire type. For length-delimited fields, it attempts to parse them as
    nested messages.

    Example:
        parser = ProtobufParser(packet_bytes)
        fields = parser.parse_message(collect_repeated={14, 16, 20})
        if 2 in fields:
            gps_field = fields[2]
            if gps_field.children:
                # Access nested GPS fields
                lat_field = gps_field.children.get(1)
    """

    def __init__(self, data: bytes):
        """Initialize parser with protobuf data.

        Args:
            data: Raw protobuf bytes to parse
        """
        self.data = data
        self.pos = 0

    def remaining(self) -> int:
        """Return number of unread bytes."""
        return len(self.data) - self.pos

    def read_varint(self) -> int:
        """Decode a variable-length integer.

        Varints use 7 bits per byte for the value, with the high bit
        indicating whether more bytes follow.

        Returns:
            The decoded integer value

        Raises:
            IndexError: If data ends before varint is complete
        """
        result = 0
        shift = 0
        while self.pos < len(self.data):
            byte = self.data[self.pos]
            self.pos += 1
            result |= (byte & 0x7F) << shift
            if not (byte & 0x80):
                break
            shift += 7
            if shift > 63:
                break
        return result

    def read_fixed64(self) -> bytes:
        """Read 8 bytes (fixed64 wire type).

        Returns:
            8 bytes of raw data
        """
        value = self.data[self.pos:self.pos + 8]
        self.pos += 8
        return value

    def read_fixed32(self) -> bytes:
        """Read 4 bytes (fixed32 wire type).

        Returns:
            4 bytes of raw data
        """
        value = self.data[self.pos:self.pos + 4]
        self.pos += 4
        return value

    def read_length_delimited(self) -> bytes:
        """Read length-prefixed data.

        First reads a varint for the length, then reads that many bytes.

        Returns:
            The length-delimited data
        """
        length = self.read_varint()
        value = self.data[self.pos:self.pos + length]
        self.pos += length
        return value

    def parse_message(
        self,
        collect_repeated: Optional[Set[int]] = None
    ) -> Dict[int, Any]:
        """Parse all fields in a protobuf message.

        Args:
            collect_repeated: Set of field numbers to collect as lists.
                Use this for repeated fields like tanks, batteries.

        Returns:
            Dictionary mapping field numbers to ProtoField objects.
            For repeated fields (in collect_repeated), maps to list of ProtoField.
        """
        fields: Dict[int, Any] = {}
        if collect_repeated is None:
            collect_repeated = set()

        while self.pos < len(self.data):
            if self.remaining() < 1:
                break

            try:
                # Read tag: (field_number << 3) | wire_type
                tag = self.read_varint()
                field_num = tag >> 3
                wire_type = tag & 0x07

                # Validate field number
                if field_num == 0 or field_num > 536870911:
                    break

                # Read value based on wire type
                if wire_type == WIRE_VARINT:
                    value = self.read_varint()
                elif wire_type == WIRE_FIXED64:
                    if self.remaining() < 8:
                        break
                    value = self.read_fixed64()
                elif wire_type == WIRE_LENGTH:
                    value = self.read_length_delimited()
                elif wire_type == WIRE_FIXED32:
                    if self.remaining() < 4:
                        break
                    value = self.read_fixed32()
                else:
                    # Unknown wire type (3, 4 are deprecated)
                    break

                # For length-delimited, try to parse as nested message
                children: Dict[int, ProtoField] = {}
                if wire_type == WIRE_LENGTH and len(value) >= 2:
                    try:
                        nested_parser = ProtobufParser(value)
                        children = nested_parser.parse_message()
                        # Discard unless at least half of the data parsed cleanly
                        if nested_parser.pos < len(value) * 0.5:
                            children = {}
                    except Exception:
                        children = {}

                pf = ProtoField(field_num, wire_type, value, children)

                # Handle repeated fields - collect as list
                if field_num in collect_repeated:
                    if field_num not in fields:
                        fields[field_num] = []
                    fields[field_num].append(pf)
                else:
                    # Keep last occurrence for non-repeated fields
                    fields[field_num] = pf

            except (IndexError, struct.error):
                break

        return fields

    @staticmethod
    def decode_double(raw: bytes) -> Optional[float]:
        """Decode 8 bytes as little-endian double.

        Args:
            raw: 8 bytes of raw data

        Returns:
            Decoded float value, or None if invalid/NaN
        """
        if len(raw) != 8:
            return None
        try:
            val = struct.unpack('<d', raw)[0]
            if val != val:  # NaN check
                return None
            return val
        except struct.error:
            return None

    @staticmethod
    def decode_float(raw: bytes) -> Optional[float]:
        """Decode 4 bytes as little-endian float.

        Args:
            raw: 4 bytes of raw data

        Returns:
            Decoded float value, or None if invalid/NaN
        """
        if len(raw) != 4:
            return None
        try:
            val = struct.unpack('<f', raw)[0]
            if val != val:  # NaN check
                return None
            return val
        except struct.error:
            return None
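The base-128 varint scheme that `ProtobufParser.read_varint` implements can be exercised standalone. This is a minimal function-form sketch of the same loop, outside the class:

```python
def read_varint(data: bytes, pos: int = 0):
    """Decode a little-endian base-128 varint; return (value, next_pos)."""
    result = shift = 0
    while pos < len(data):
        byte = data[pos]
        pos += 1
        result |= (byte & 0x7F) << shift   # low 7 bits carry the value
        if not (byte & 0x80):              # high bit clear = last byte
            break
        shift += 7
    return result, pos

# 300 = 0b10_0101100: low 7 bits first (0xAC, continuation set), then 0x02
value, nxt = read_varint(b"\xac\x02")
print(value, nxt)  # 300 2
```

The same decoding also yields the field tag, since a tag is itself a varint of `(field_number << 3) | wire_type`.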
135
axiom-nmea/raymarine_nmea/sensors/__init__.py
Normal file
@@ -0,0 +1,135 @@
"""
Sensor configuration and metadata.

This module provides configuration for tanks, batteries, and network settings
specific to the vessel. These can be customized for different installations.
"""

from dataclasses import dataclass
from typing import Dict, Tuple, Optional


@dataclass
class TankInfo:
    """Configuration for a tank sensor."""
    name: str
    capacity_gallons: Optional[float] = None
    tank_type: str = "fuel"  # fuel, water, blackwater, other

    @property
    def capacity_liters(self) -> Optional[float]:
        """Return capacity in liters."""
        if self.capacity_gallons is None:
            return None
        return self.capacity_gallons * 3.78541


@dataclass
class BatteryInfo:
    """Configuration for a battery sensor."""
    name: str
    nominal_voltage: float = 12.0  # 12V, 24V, 48V
    battery_type: str = "house"  # house, engine, starter


# Default tank configuration
# Key is the tank ID from Raymarine
TANK_CONFIG: Dict[int, TankInfo] = {
    1: TankInfo("Fuel Starboard", 265, "fuel"),
    2: TankInfo("Fuel Port", 265, "fuel"),
    10: TankInfo("Water Bow", 90, "water"),
    11: TankInfo("Water Stern", 90, "water"),
    100: TankInfo("Black Water", 53, "blackwater"),
}

# Default battery configuration
# Key is the battery ID from Raymarine
# IDs 1000+ are engine batteries (1000 + engine_id)
BATTERY_CONFIG: Dict[int, BatteryInfo] = {
    11: BatteryInfo("House Battery Bow", 24.0, "house"),
    13: BatteryInfo("House Battery Stern", 24.0, "house"),
    1000: BatteryInfo("Engine Port", 24.0, "engine"),
    1001: BatteryInfo("Engine Starboard", 24.0, "engine"),
}

# Raymarine multicast groups
# Each tuple is (group_address, port)
#
# OPTIMIZATION: Based on testing, ALL sensor data (GPS, wind, depth, heading,
# temperature, tanks, batteries) comes from the primary group. The other groups
# contain only heartbeats, display sync, or zero/empty data.
#
# Test results showed:
#   226.192.206.102:2565 - 520 pkts, 297 decoded (GPS, heading, wind, depth, temp, tanks, batteries)
#   226.192.219.0:3221   - 2707 pkts, 0 decoded (display sync - high traffic, no data!)
#   226.192.206.99:2562  - 402 pkts, 0 decoded (heartbeat only)
#   226.192.206.98:2561  - 356 pkts, 0 decoded (mostly zeros)
#   Others               - <30 pkts each, 0 decoded
#
# Using only the primary group reduces:
#   - Thread count: 7 → 1 (less context switching)
#   - Packet processing: ~4000 → ~500 packets (87% reduction)
#   - CPU usage: significant reduction on embedded devices

# Primary multicast group - contains ALL sensor data
MULTICAST_GROUPS: list[Tuple[str, int]] = [
    ("226.192.206.102", 2565),  # PRIMARY sensor data (GPS, wind, depth, tanks, batteries, etc.)
]

# Legacy: All groups (for debugging/testing only)
MULTICAST_GROUPS_ALL: list[Tuple[str, int]] = [
    ("226.192.206.98", 2561),   # Navigation sensors (mostly zeros)
    ("226.192.206.99", 2562),   # Heartbeat/status
    ("226.192.206.100", 2563),  # Alternative data (low traffic)
    ("226.192.206.101", 2564),  # Alternative data (low traffic)
    ("226.192.206.102", 2565),  # PRIMARY sensor data
    ("226.192.219.0", 3221),    # Display synchronization (HIGH traffic, no sensor data!)
    ("239.2.1.1", 2154),        # Tank/engine data (not used on this vessel)
]

# Primary multicast group for most sensor data
PRIMARY_MULTICAST_GROUP = ("226.192.206.102", 2565)


def get_tank_name(tank_id: int) -> str:
    """Get the display name for a tank ID."""
    if tank_id in TANK_CONFIG:
        return TANK_CONFIG[tank_id].name
    return f"Tank #{tank_id}"


def get_tank_capacity(tank_id: int) -> Optional[float]:
    """Get the capacity in gallons for a tank ID."""
    if tank_id in TANK_CONFIG:
        return TANK_CONFIG[tank_id].capacity_gallons
    return None


def get_battery_name(battery_id: int) -> str:
    """Get the display name for a battery ID."""
    if battery_id in BATTERY_CONFIG:
        return BATTERY_CONFIG[battery_id].name
    return f"Battery #{battery_id}"


def get_battery_nominal_voltage(battery_id: int) -> float:
    """Get the nominal voltage for a battery ID."""
    if battery_id in BATTERY_CONFIG:
        return BATTERY_CONFIG[battery_id].nominal_voltage
    # Default to 12V for unknown batteries
    return 12.0


__all__ = [
    "TankInfo",
    "BatteryInfo",
    "TANK_CONFIG",
    "BATTERY_CONFIG",
    "MULTICAST_GROUPS",
    "MULTICAST_GROUPS_ALL",
    "PRIMARY_MULTICAST_GROUP",
    "get_tank_name",
    "get_tank_capacity",
    "get_battery_name",
    "get_battery_nominal_voltage",
]
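`TankInfo.capacity_liters` is a plain US-gallon conversion (1 US gal = 3.78541 L). A condensed standalone mirror of the dataclass shows the property in use, with the Fuel Starboard figures from `TANK_CONFIG`:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TankInfo:  # condensed mirror of the module's dataclass, for illustration
    name: str
    capacity_gallons: Optional[float] = None

    @property
    def capacity_liters(self) -> Optional[float]:
        if self.capacity_gallons is None:
            return None  # unknown capacity stays unknown
        return self.capacity_gallons * 3.78541  # US gallons → liters

stbd = TankInfo("Fuel Starboard", 265)
print(round(stbd.capacity_liters, 1))  # 1003.1
print(TankInfo("Unknown").capacity_liters)  # None
```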
51
axiom-nmea/raymarine_nmea/venus_dbus/__init__.py
Normal file
@@ -0,0 +1,51 @@
"""
Venus OS D-Bus Publisher Module.

This module provides D-Bus services for publishing Raymarine sensor data
to Venus OS, making it available to the Victron ecosystem.

Supported services:
    - GPS: Position, speed, course, altitude
    - Meteo: Wind direction and speed
    - Navigation: Heading, depth, water temperature
    - Tank: Tank levels for fuel, water, waste
    - Battery: Battery voltage and state of charge

Example usage:
    from raymarine_nmea import SensorData, RaymarineDecoder, MulticastListener
    from raymarine_nmea.venus_dbus import VenusPublisher

    # Create sensor data store
    sensor_data = SensorData()
    decoder = RaymarineDecoder()

    # Start multicast listener
    listener = MulticastListener(
        decoder=decoder,
        sensor_data=sensor_data,
        interface_ip="198.18.5.5",
    )
    listener.start()

    # Start Venus OS publisher
    publisher = VenusPublisher(sensor_data)
    publisher.start()  # Blocks in GLib main loop
"""

from .service import VeDbusServiceBase
from .gps import GpsService
from .meteo import MeteoService
from .navigation import NavigationService
from .tank import TankService
from .battery import BatteryService
from .publisher import VenusPublisher

__all__ = [
    "VeDbusServiceBase",
    "GpsService",
    "MeteoService",
    "NavigationService",
    "TankService",
    "BatteryService",
    "VenusPublisher",
]
382
axiom-nmea/raymarine_nmea/venus_dbus/battery.py
Normal file
@@ -0,0 +1,382 @@
"""
Battery D-Bus service for Venus OS.

Publishes battery data to the Venus OS D-Bus using the
com.victronenergy.battery service type.

Each physical battery requires a separate D-Bus service instance.

D-Bus paths:
    /Dc/0/Voltage - Battery voltage in V DC
    /Dc/0/Current - Not available from Raymarine
    /Dc/0/Power - Not available from Raymarine
    /Dc/0/Temperature - Not available from Raymarine
    /Soc - Estimated from voltage for 24V AGM batteries
    /Connected - Connection status
"""

import logging
import time
from typing import Any, Dict, Optional, List, Tuple

from .service import VeDbusServiceBase
from ..data.store import SensorData
from ..sensors import BATTERY_CONFIG, BatteryInfo, get_battery_name, get_battery_nominal_voltage

logger = logging.getLogger(__name__)

# Alert thresholds
LOW_VOLTAGE_ALERT_THRESHOLD = 23.0  # Volts - alert if below this
LOW_VOLTAGE_WARNING_DELAY = 60.0    # Seconds - warning (level 1) after this duration
LOW_VOLTAGE_ALARM_DELAY = 300.0     # Seconds - alarm (level 2) after this duration

# High voltage alert thresholds
HIGH_VOLTAGE_ALERT_THRESHOLD = 30.2  # Volts - alert if above this (above normal absorption)
HIGH_VOLTAGE_WARNING_DELAY = 60.0    # Seconds - warning (level 1) after this duration
HIGH_VOLTAGE_ALARM_DELAY = 300.0     # Seconds - alarm (level 2) after this duration

# 24V AGM battery voltage to SOC lookup table
# Based on resting voltage (no load, no charge) for AGM batteries
# Format: (voltage, soc_percent)
# 24V = 2x 12V batteries in series
AGM_24V_SOC_TABLE: List[Tuple[float, int]] = [
    (25.50, 100),  # 12.75V per battery - fully charged
    (25.30, 95),
    (25.10, 90),
    (24.90, 85),
    (24.70, 80),
    (24.50, 75),
    (24.30, 70),
    (24.10, 65),
    (23.90, 60),
    (23.70, 55),
    (23.50, 50),
    (23.30, 45),
    (23.10, 40),
    (22.90, 35),
    (22.70, 30),
    (22.50, 25),
    (22.30, 20),
    (22.10, 15),
    (21.90, 10),
    (21.70, 5),
    (21.50, 0),    # 10.75V per battery - fully discharged
]


def estimate_soc_24v_agm(voltage: float) -> int:
    """Estimate state of charge for a 24V AGM battery based on voltage.

    This is an approximation based on resting voltage. Actual SOC can vary
    based on load, temperature, and battery age.

    Args:
        voltage: Battery voltage in volts

    Returns:
        Estimated SOC as percentage (0-100)
    """
    if voltage >= AGM_24V_SOC_TABLE[0][0]:
        return 100
    if voltage <= AGM_24V_SOC_TABLE[-1][0]:
        return 0

    # Find the two points to interpolate between
    for i in range(len(AGM_24V_SOC_TABLE) - 1):
        v_high, soc_high = AGM_24V_SOC_TABLE[i]
        v_low, soc_low = AGM_24V_SOC_TABLE[i + 1]

        if v_low <= voltage <= v_high:
            # Linear interpolation
            ratio = (voltage - v_low) / (v_high - v_low)
            soc = soc_low + ratio * (soc_high - soc_low)
            return int(round(soc))

    return 50  # Fallback
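The table interpolation in `estimate_soc_24v_agm` can be condensed to a three-row sketch. The table below is illustrative (the real one steps every 5% between 21.50 V and 25.50 V), but the interpolation logic is the same:

```python
# Condensed (voltage, SOC %) table, sorted descending by voltage
TABLE = [(25.50, 100), (23.50, 50), (21.50, 0)]

def estimate_soc(voltage: float) -> int:
    """Linearly interpolate SOC from a descending voltage table."""
    if voltage >= TABLE[0][0]:
        return 100          # at or above the fully-charged row
    if voltage <= TABLE[-1][0]:
        return 0            # at or below the fully-discharged row
    for (v_hi, soc_hi), (v_lo, soc_lo) in zip(TABLE, TABLE[1:]):
        if v_lo <= voltage <= v_hi:
            ratio = (voltage - v_lo) / (v_hi - v_lo)
            return int(round(soc_lo + ratio * (soc_hi - soc_lo)))
    return 50               # fallback, should be unreachable

print(estimate_soc(24.50))  # 75 - halfway between the 23.50 and 25.50 rows
```

Note that 24.50 V maps to 75% here exactly as it does in the full table, since both tables are linear over that span.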
class BatteryService(VeDbusServiceBase):
|
||||
"""Battery D-Bus service for Venus OS.
|
||||
|
||||
Publishes a single battery's voltage data to the Venus OS D-Bus.
|
||||
Create one instance per physical battery.
|
||||
|
||||
Note: Raymarine only provides voltage readings. Current, power,
|
||||
SOC, and other advanced metrics require a dedicated battery monitor
|
||||
like a Victron BMV or SmartShunt.
|
||||
|
||||
Example:
|
||||
sensor_data = SensorData()
|
||||
|
||||
# Create service for battery ID 11
|
||||
battery_service = BatteryService(
|
||||
sensor_data=sensor_data,
|
||||
battery_id=11,
|
||||
device_instance=0,
|
||||
)
|
||||
battery_service.register()
|
||||
|
||||
# In update loop:
|
||||
battery_service.update()
|
||||
"""
|
||||
|
||||
service_type = "battery"
|
||||
product_name = "Raymarine Battery Monitor"
|
||||
product_id = 0xA143 # Custom product ID for Raymarine Battery
|
||||
|
||||
# Maximum age in seconds before data is considered stale
|
||||
MAX_DATA_AGE = 30.0
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
sensor_data: SensorData,
|
||||
battery_id: int,
|
||||
device_instance: int = 0,
|
||||
battery_config: Optional[BatteryInfo] = None,
|
||||
custom_name: Optional[str] = None,
|
||||
):
|
||||
"""Initialize Battery service.
|
||||
|
||||
Args:
|
||||
sensor_data: SensorData instance to read values from
|
||||
battery_id: The Raymarine battery ID
|
||||
device_instance: Unique instance number for this battery
|
||||
battery_config: Optional BatteryInfo override
|
||||
custom_name: Optional custom display name
|
||||
"""
|
||||
# Get battery configuration
|
||||
if battery_config:
|
||||
self._battery_config = battery_config
|
||||
elif battery_id in BATTERY_CONFIG:
|
||||
self._battery_config = BATTERY_CONFIG[battery_id]
|
||||
else:
|
||||
self._battery_config = BatteryInfo(f"Battery #{battery_id}", 12.0, "house")
|
||||
|
||||
# Use battery name as product name
|
||||
self.product_name = self._battery_config.name
|
||||
|
||||
super().__init__(
|
||||
device_instance=device_instance,
|
||||
connection=f"Raymarine Battery {battery_id}",
|
||||
custom_name=custom_name or self._battery_config.name,
|
||||
)
|
||||
self._sensor_data = sensor_data
|
||||
self._battery_id = battery_id
|
||||
|
||||
# Track when voltage first dropped below threshold for delayed alert
|
||||
self._low_voltage_since: Optional[float] = None
|
||||
self._low_voltage_warned = False # Tracks if warning (level 1) was logged
|
||||
self._low_voltage_alerted = False # Tracks if alarm (level 2) was logged
|
||||
|
||||
# Track when voltage first exceeded threshold for delayed alert
|
||||
self._high_voltage_since: Optional[float] = None
|
||||
self._high_voltage_warned = False # Tracks if warning (level 1) was logged
|
||||
self._high_voltage_alerted = False # Tracks if alarm (level 2) was logged
|
||||
|
||||
@property
|
||||
def service_name(self) -> str:
|
||||
"""Get the full D-Bus service name."""
|
||||
return f"com.victronenergy.battery.raymarine_bat{self._battery_id}_{self.device_instance}"
|
||||
|
||||
def _get_paths(self) -> Dict[str, Dict[str, Any]]:
|
||||
"""Return battery-specific D-Bus paths."""
|
||||
return {
|
||||
# DC measurements
|
||||
'/Dc/0/Voltage': {'initial': None},
|
||||
'/Dc/0/Current': {'initial': None},
|
||||
'/Dc/0/Power': {'initial': None},
|
||||
'/Dc/0/Temperature': {'initial': None},
|
||||
|
||||
# State of charge (not available from Raymarine)
|
||||
'/Soc': {'initial': None},
|
||||
'/TimeToGo': {'initial': None},
|
||||
'/ConsumedAmphours': {'initial': None},
|
||||
|
||||
# Alarms (not monitored via Raymarine)
|
||||
'/Alarms/LowVoltage': {'initial': 0},
|
||||
'/Alarms/HighVoltage': {'initial': 0},
|
||||
'/Alarms/LowSoc': {'initial': None},
|
||||
'/Alarms/LowTemperature': {'initial': None},
|
||||
'/Alarms/HighTemperature': {'initial': None},
|
||||
|
||||
# Settings
|
||||
'/Settings/HasTemperature': {'initial': 0},
|
||||
'/Settings/HasStarterVoltage': {'initial': 0},
|
||||
'/Settings/HasMidVoltage': {'initial': 0},
|
||||
}
|
||||
|
||||
def _update(self) -> None:
|
||||
"""Update battery values from sensor data."""
|
||||
data = self._sensor_data
|
||||
now = time.time()
|
||||
|
||||
# Check data freshness
|
||||
is_stale = data.is_stale('battery', self.MAX_DATA_AGE)
|
||||
|
||||
# Get battery voltage from live data
|
||||
voltage = data.batteries.get(self._battery_id)
|
||||
|
||||
if voltage is not None and not is_stale:
|
||||
# Valid battery data
|
||||
self._set_value('/Dc/0/Voltage', round(voltage, 2))
|
||||
|
||||
# Estimate SOC for 24V AGM batteries
|
||||
nominal = self._battery_config.nominal_voltage
|
||||
if nominal == 24.0:
|
||||
soc = estimate_soc_24v_agm(voltage)
|
||||
self._set_value('/Soc', soc)
|
||||
else:
|
||||
self._set_value('/Soc', None)
|
||||
|
||||
# Low voltage alert with hysteresis
|
||||
# Tiered alerts prevent false alarms from temporary voltage drops (e.g., engine start)
|
||||
# - 0-60 seconds: No alarm (monitoring)
|
||||
# - 60-300 seconds: Warning (level 1)
|
||||
# - 300+ seconds: Alarm (level 2)
|
||||
if voltage < LOW_VOLTAGE_ALERT_THRESHOLD:
|
||||
if self._low_voltage_since is None:
|
||||
# First time below threshold - start tracking but don't alert yet
|
||||
self._low_voltage_since = now
|
||||
logger.info(
|
||||
f"{self._battery_config.name}: Voltage {voltage:.1f}V dropped below "
|
||||
f"{LOW_VOLTAGE_ALERT_THRESHOLD}V threshold, monitoring..."
|
||||
)
|
||||
|
||||
elapsed = now - self._low_voltage_since
|
||||
if elapsed >= LOW_VOLTAGE_ALARM_DELAY:
|
||||
# Below threshold for 5+ minutes - ALARM (level 2)
|
||||
self._set_value('/Alarms/LowVoltage', 2)
|
||||
if not self._low_voltage_alerted:
|
||||
logger.error(
|
||||
f"ALARM: {self._battery_config.name} voltage {voltage:.1f}V "
|
||||
f"has been below {LOW_VOLTAGE_ALERT_THRESHOLD}V for "
|
||||
f"{elapsed:.0f} seconds!"
|
||||
)
|
||||
self._low_voltage_alerted = True
|
||||
elif elapsed >= LOW_VOLTAGE_WARNING_DELAY:
|
||||
# Below threshold for 1-5 minutes - WARNING (level 1)
|
||||
self._set_value('/Alarms/LowVoltage', 1)
|
||||
if not self._low_voltage_warned:
|
||||
logger.warning(
|
||||
f"WARNING: {self._battery_config.name} voltage {voltage:.1f}V "
|
||||
f"has been below {LOW_VOLTAGE_ALERT_THRESHOLD}V for "
|
||||
f"{elapsed:.0f} seconds"
|
||||
)
|
||||
self._low_voltage_warned = True
|
||||
else:
|
||||
# Below threshold but within hysteresis period - no alarm yet
|
||||
self._set_value('/Alarms/LowVoltage', 0)
|
||||
else:
|
||||
# Voltage is OK - reset tracking
|
||||
if self._low_voltage_since is not None:
|
||||
logger.info(
|
||||
f"{self._battery_config.name}: Voltage recovered to {voltage:.1f}V"
|
||||
)
|
||||
self._low_voltage_since = None
|
||||
self._low_voltage_warned = False
|
||||
self._low_voltage_alerted = False
|
||||
self._set_value('/Alarms/LowVoltage', 0)
|
||||
|
||||
# High voltage alert with hysteresis
|
||||
# Tiered alerts prevent false alarms from temporary voltage spikes (e.g., charging)
|
||||
# - 0-60 seconds: No alarm (monitoring)
|
||||
# - 60-300 seconds: Warning (level 1)
|
||||
# - 300+ seconds: Alarm (level 2)
|
||||
if voltage > HIGH_VOLTAGE_ALERT_THRESHOLD:
|
||||
if self._high_voltage_since is None:
|
||||
# First time above threshold - start tracking but don't alert yet
|
||||
self._high_voltage_since = now
|
||||
logger.info(
|
||||
f"{self._battery_config.name}: Voltage {voltage:.1f}V exceeded "
|
||||
f"{HIGH_VOLTAGE_ALERT_THRESHOLD}V threshold, monitoring..."
|
||||
)
|
||||
|
||||
elapsed = now - self._high_voltage_since
|
||||
if elapsed >= HIGH_VOLTAGE_ALARM_DELAY:
|
||||
# Above threshold for 5+ minutes - ALARM (level 2)
|
||||
self._set_value('/Alarms/HighVoltage', 2)
|
||||
if not self._high_voltage_alerted:
|
||||
logger.error(
|
||||
f"ALARM: {self._battery_config.name} voltage {voltage:.1f}V "
|
||||
f"has been above {HIGH_VOLTAGE_ALERT_THRESHOLD}V for "
|
||||
f"{elapsed:.0f} seconds!"
|
||||
)
|
||||
self._high_voltage_alerted = True
|
||||
elif elapsed >= HIGH_VOLTAGE_WARNING_DELAY:
|
||||
# Above threshold for 1-5 minutes - WARNING (level 1)
|
||||
self._set_value('/Alarms/HighVoltage', 1)
|
||||
if not self._high_voltage_warned:
|
||||
logger.warning(
|
||||
f"WARNING: {self._battery_config.name} voltage {voltage:.1f}V "
|
||||
f"has been above {HIGH_VOLTAGE_ALERT_THRESHOLD}V for "
|
||||
f"{elapsed:.0f} seconds"
|
||||
)
|
                    self._high_voltage_warned = True
                else:
                    # Above threshold but within hysteresis period - no alarm yet
                    self._set_value('/Alarms/HighVoltage', 0)
            else:
                # Voltage is OK - reset tracking
                if self._high_voltage_since is not None:
                    logger.info(
                        f"{self._battery_config.name}: Voltage recovered to {voltage:.1f}V"
                    )
                    self._high_voltage_since = None
                    self._high_voltage_warned = False
                    self._high_voltage_alerted = False
                self._set_value('/Alarms/HighVoltage', 0)

        else:
            # No data or stale - show as unavailable
            self._set_value('/Dc/0/Voltage', None)
            self._set_value('/Soc', None)
            self._set_value('/Alarms/LowVoltage', 0)
            self._set_value('/Alarms/HighVoltage', 0)
            # Reset voltage tracking when data is stale
            self._low_voltage_since = None
            self._low_voltage_warned = False
            self._low_voltage_alerted = False
            self._high_voltage_since = None
            self._high_voltage_warned = False
            self._high_voltage_alerted = False

        # These are not available from Raymarine
        self._set_value('/Dc/0/Current', None)
        self._set_value('/Dc/0/Power', None)
        self._set_value('/Dc/0/Temperature', None)
        self._set_value('/TimeToGo', None)
        self._set_value('/ConsumedAmphours', None)

        # Update connection status
        self.set_connected(voltage is not None and not is_stale)


def create_battery_services(
    sensor_data: SensorData,
    battery_ids: Optional[List[int]] = None,
    start_instance: int = 0,
) -> List[BatteryService]:
    """Create BatteryService instances for multiple batteries.

    Args:
        sensor_data: SensorData instance to read values from
        battery_ids: List of battery IDs to create services for.
            If None, creates services for all configured batteries.
        start_instance: Starting device instance number

    Returns:
        List of BatteryService instances
    """
    if battery_ids is None:
        battery_ids = list(BATTERY_CONFIG.keys())

    services = []
    for i, battery_id in enumerate(battery_ids):
        service = BatteryService(
            sensor_data=sensor_data,
            battery_id=battery_id,
            device_instance=start_instance + i,
        )
        services.append(service)

    return services
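The battery alarm logic above follows a time-hysteresis pattern: an alarm is raised only after the triggering condition has held continuously for a hold period, and the timer resets as soon as the condition clears. A standalone sketch of that pattern (class and names are illustrative, not from the repo):

```python
import time


class HysteresisAlarm:
    """Raise an alarm only after a condition holds for `hold_s` seconds.

    Standalone sketch of the pattern used by the battery service's
    high/low-voltage alarms; the real code tracks `_since`/`_warned`/
    `_alerted` attributes per alarm instead.
    """

    def __init__(self, hold_s: float = 60.0):
        self.hold_s = hold_s
        self._since = None  # when the condition first became true

    def update(self, condition: bool, now: float = None) -> int:
        """Return 2 (alarm) once `condition` has held for hold_s, else 0."""
        if now is None:
            now = time.monotonic()
        if condition:
            if self._since is None:
                self._since = now  # condition just started - begin timing
            if now - self._since >= self.hold_s:
                return 2  # alarm
            return 0      # within hysteresis period - no alarm yet
        self._since = None  # condition cleared - reset tracking
        return 0
```

The hold period prevents momentary voltage spikes (e.g. during engine start) from raising spurious alarms on the GX display.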
134 axiom-nmea/raymarine_nmea/venus_dbus/gps.py Normal file
@@ -0,0 +1,134 @@
"""
GPS D-Bus service for Venus OS.

Publishes GPS position, speed, and course data to the Venus OS D-Bus
using the com.victronenergy.gps service type.

D-Bus paths:
    /Altitude - Height in meters
    /Course - Direction in degrees (COG)
    /Fix - GPS fix status (0=no fix, 1=fix)
    /NrOfSatellites - Number of satellites (not available from Raymarine)
    /Position/Latitude - Latitude in degrees
    /Position/Longitude - Longitude in degrees
    /Speed - Speed in m/s (SOG)
"""

import logging
from typing import Any, Dict, Optional

from .service import VeDbusServiceBase
from ..data.store import SensorData
from ..protocol.constants import MS_TO_KTS

logger = logging.getLogger(__name__)

# Conversion: knots to m/s
KTS_TO_MS = 1.0 / MS_TO_KTS  # Approximately 0.514444


class GpsService(VeDbusServiceBase):
    """GPS D-Bus service for Venus OS.

    Publishes GPS position, speed, and course from Raymarine sensors
    to the Venus OS D-Bus.

    Example:
        sensor_data = SensorData()
        gps_service = GpsService(sensor_data)
        gps_service.register()

        # In update loop:
        gps_service.update()
    """

    service_type = "gps"
    product_name = "Raymarine GPS"
    product_id = 0xA140  # Custom product ID for Raymarine GPS

    # Maximum age in seconds before GPS data is considered stale
    MAX_DATA_AGE = 10.0

    def __init__(
        self,
        sensor_data: SensorData,
        device_instance: int = 0,
        custom_name: Optional[str] = None,
    ):
        """Initialize GPS service.

        Args:
            sensor_data: SensorData instance to read GPS values from
            device_instance: Unique instance number (default: 0)
            custom_name: Optional custom display name
        """
        super().__init__(
            device_instance=device_instance,
            connection="Raymarine LightHouse GPS",
            custom_name=custom_name,
        )
        self._sensor_data = sensor_data

    def _get_paths(self) -> Dict[str, Dict[str, Any]]:
        """Return GPS-specific D-Bus paths."""
        return {
            '/Altitude': {'initial': None},
            '/Course': {'initial': None},
            '/Fix': {'initial': 0},
            '/NrOfSatellites': {'initial': None},
            '/Position/Latitude': {'initial': None},
            '/Position/Longitude': {'initial': None},
            '/Speed': {'initial': None},
        }

    def _update(self) -> None:
        """Update GPS values from sensor data."""
        data = self._sensor_data

        # Check if we have valid GPS data
        has_position = (
            data.latitude is not None and
            data.longitude is not None
        )

        # Check data freshness
        is_stale = data.is_stale('gps', self.MAX_DATA_AGE)

        if has_position and not is_stale:
            # Valid GPS fix
            self._set_value('/Fix', 1)
            self._set_value('/Position/Latitude', data.latitude)
            self._set_value('/Position/Longitude', data.longitude)

            # Course over ground (degrees)
            if data.cog_deg is not None:
                self._set_value('/Course', round(data.cog_deg, 1))
            else:
                self._set_value('/Course', None)

            # Speed over ground (convert knots to m/s)
            if data.sog_kts is not None:
                speed_ms = data.sog_kts * KTS_TO_MS
                self._set_value('/Speed', round(speed_ms, 2))
            else:
                self._set_value('/Speed', None)

            # Altitude not available from Raymarine multicast
            # (would need NMEA GGA sentence with altitude field)
            self._set_value('/Altitude', None)

            # Number of satellites not available from Raymarine
            self._set_value('/NrOfSatellites', None)

        else:
            # No GPS fix or stale data
            self._set_value('/Fix', 0)
            self._set_value('/Position/Latitude', None)
            self._set_value('/Position/Longitude', None)
            self._set_value('/Course', None)
            self._set_value('/Speed', None)
            self._set_value('/Altitude', None)
            self._set_value('/NrOfSatellites', None)

        # Update connection status
        self.set_connected(not is_stale and has_position)
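Venus OS expects `/Speed` in m/s while Raymarine reports speed over ground in knots, so the module derives `KTS_TO_MS` from the protocol's `MS_TO_KTS` constant. A minimal sketch of the conversion, assuming `MS_TO_KTS` is the usual 3600/1852 factor (one knot = 1852 m per hour); the repo's actual constant lives in `raymarine_nmea.protocol.constants`:

```python
# Assumed definition, mirroring the constant imported from ..protocol.constants:
MS_TO_KTS = 3600.0 / 1852.0   # ~1.943844 knots per m/s
KTS_TO_MS = 1.0 / MS_TO_KTS   # ~0.514444 m/s per knot


def sog_knots_to_dbus_speed(sog_kts: float) -> float:
    """Convert speed over ground in knots to the rounded m/s value
    published on /Speed (two decimals, as in GpsService._update)."""
    return round(sog_kts * KTS_TO_MS, 2)
```

For example, a 10 kt SOG is published as roughly 5.14 m/s.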
151 axiom-nmea/raymarine_nmea/venus_dbus/meteo.py Normal file
@@ -0,0 +1,151 @@
"""
Meteo (Weather) D-Bus service for Venus OS.

Publishes wind and environmental data to the Venus OS D-Bus
using the com.victronenergy.meteo service type.

D-Bus paths:
    /WindDirection - True wind direction in degrees (0-360)
    /WindSpeed - True wind speed in m/s (Venus OS requirement)
    /ExternalTemperature - Air temperature in Celsius
    /CellTemperature - Not used (panel temperature for solar)
    /Irradiance - Not used (solar irradiance)
"""

import logging
from typing import Any, Dict, Optional

from .service import VeDbusServiceBase
from ..data.store import SensorData
from ..protocol.constants import MS_TO_KTS

logger = logging.getLogger(__name__)

# Conversion: knots to m/s
KTS_TO_MS = 1.0 / MS_TO_KTS  # Approximately 0.514444


class MeteoService(VeDbusServiceBase):
    """Meteo (Weather) D-Bus service for Venus OS.

    Publishes wind direction, wind speed, and temperature from
    Raymarine sensors to the Venus OS D-Bus.

    Example:
        sensor_data = SensorData()
        meteo_service = MeteoService(sensor_data)
        meteo_service.register()

        # In update loop:
        meteo_service.update()
    """

    service_type = "meteo"
    product_name = "Weather"
    product_id = 0xA141  # Custom product ID for Raymarine Meteo

    # Maximum age in seconds before data is considered stale
    MAX_DATA_AGE = 10.0

    def __init__(
        self,
        sensor_data: SensorData,
        device_instance: int = 0,
        custom_name: Optional[str] = None,
    ):
        """Initialize Meteo service.

        Args:
            sensor_data: SensorData instance to read values from
            device_instance: Unique instance number (default: 0)
            custom_name: Optional custom display name
        """
        super().__init__(
            device_instance=device_instance,
            connection="Raymarine LightHouse Weather",
            custom_name=custom_name,
        )
        self._sensor_data = sensor_data

    def _get_paths(self) -> Dict[str, Dict[str, Any]]:
        """Return meteo-specific D-Bus paths."""
        return {
            # Standard meteo paths
            '/WindDirection': {'initial': None},
            '/WindSpeed': {'initial': None},
            '/ExternalTemperature': {'initial': None},
            '/CellTemperature': {'initial': None},
            '/Irradiance': {'initial': None},
            '/ErrorCode': {'initial': 0},

            # Extended paths for apparent wind
            # These may not be recognized by all Venus OS components
            # but are useful for custom dashboards
            '/ApparentWindAngle': {'initial': None},
            '/ApparentWindSpeed': {'initial': None},

            # Barometric pressure (custom extension)
            '/Pressure': {'initial': None},
        }

    def _update(self) -> None:
        """Update meteo values from sensor data."""
        data = self._sensor_data

        # Check data freshness
        wind_stale = data.is_stale('wind', self.MAX_DATA_AGE)
        temp_stale = data.is_stale('temp', self.MAX_DATA_AGE)
        pressure_stale = data.is_stale('pressure', self.MAX_DATA_AGE)

        # True wind direction (degrees 0-360)
        if not wind_stale and data.twd_deg is not None:
            self._set_value('/WindDirection', round(data.twd_deg, 1))
        else:
            self._set_value('/WindDirection', None)

        # True wind speed (convert knots to m/s for Venus OS)
        if not wind_stale and data.tws_kts is not None:
            speed_ms = data.tws_kts * KTS_TO_MS
            self._set_value('/WindSpeed', round(speed_ms, 2))
        else:
            self._set_value('/WindSpeed', None)

        # Apparent wind angle (degrees, relative to bow)
        if not wind_stale and data.awa_deg is not None:
            self._set_value('/ApparentWindAngle', round(data.awa_deg, 1))
        else:
            self._set_value('/ApparentWindAngle', None)

        # Apparent wind speed (convert knots to m/s for Venus OS)
        if not wind_stale and data.aws_kts is not None:
            speed_ms = data.aws_kts * KTS_TO_MS
            self._set_value('/ApparentWindSpeed', round(speed_ms, 2))
        else:
            self._set_value('/ApparentWindSpeed', None)

        # Air temperature (Celsius)
        if not temp_stale and data.air_temp_c is not None:
            self._set_value('/ExternalTemperature', round(data.air_temp_c, 1))
        else:
            self._set_value('/ExternalTemperature', None)

        # Barometric pressure (convert mbar to hPa - they're equivalent)
        if not pressure_stale and data.pressure_mbar is not None:
            self._set_value('/Pressure', round(data.pressure_mbar, 1))
        else:
            self._set_value('/Pressure', None)

        # Cell temperature and irradiance are not available
        self._set_value('/CellTemperature', None)
        self._set_value('/Irradiance', None)

        # Error code: 0 = OK
        has_any_data = (
            (not wind_stale and data.twd_deg is not None) or
            (not temp_stale and data.air_temp_c is not None) or
            (not pressure_stale and data.pressure_mbar is not None)
        )
        self._set_value('/ErrorCode', 0 if has_any_data else 1)

        # Update connection status
        self.set_connected(not wind_stale)
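All of these services gate their published values on `data.is_stale(category, MAX_DATA_AGE)`, publishing `None` once a reading ages past 10 seconds. A minimal sketch of the freshness check, assuming per-category timestamp tracking (the `FreshnessStore` class is a hypothetical stand-in; the real store is `SensorData` in `raymarine_nmea.data.store`):

```python
import time


class FreshnessStore:
    """Hypothetical stand-in for SensorData's staleness tracking."""

    def __init__(self):
        self._last_seen = {}  # category -> timestamp of last update

    def touch(self, category: str, now: float = None) -> None:
        """Record that a reading in `category` was just received."""
        self._last_seen[category] = time.monotonic() if now is None else now

    def is_stale(self, category: str, max_age_s: float, now: float = None) -> bool:
        """True if `category` was never updated or its last update is too old."""
        if now is None:
            now = time.monotonic()
        last = self._last_seen.get(category)
        return last is None or (now - last) > max_age_s
```

Keeping staleness per category ('wind', 'temp', 'pressure', ...) lets one sensor drop out without blanking the others.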
102 axiom-nmea/raymarine_nmea/venus_dbus/navigation.py Normal file
@@ -0,0 +1,102 @@
"""
Navigation D-Bus service for Venus OS.

Publishes navigation data not covered by the standard GPS or Meteo
services to the Venus OS D-Bus, making it available to custom addons
via D-Bus and MQTT.

D-Bus paths:
    /Heading - True heading in degrees (0-360)
    /Depth - Depth below transducer in meters
    /WaterTemperature - Water temperature in Celsius
"""

import logging
from typing import Any, Dict, Optional

from .service import VeDbusServiceBase
from ..data.store import SensorData

logger = logging.getLogger(__name__)


class NavigationService(VeDbusServiceBase):
    """Navigation D-Bus service for Venus OS.

    Publishes heading, depth, and water temperature from Raymarine
    sensors to the Venus OS D-Bus. These values are decoded from the
    Raymarine protocol but don't map to standard Venus OS service types,
    so they are grouped under a custom 'navigation' service.

    Example:
        sensor_data = SensorData()
        nav_service = NavigationService(sensor_data)
        nav_service.register()

        # In update loop:
        nav_service.update()
    """

    service_type = "navigation"
    product_name = "Raymarine Navigation"
    product_id = 0xA143

    MAX_DATA_AGE = 10.0

    def __init__(
        self,
        sensor_data: SensorData,
        device_instance: int = 0,
        custom_name: Optional[str] = None,
    ):
        """Initialize Navigation service.

        Args:
            sensor_data: SensorData instance to read values from
            device_instance: Unique instance number (default: 0)
            custom_name: Optional custom display name
        """
        super().__init__(
            device_instance=device_instance,
            connection="Raymarine LightHouse Navigation",
            custom_name=custom_name,
        )
        self._sensor_data = sensor_data

    def _get_paths(self) -> Dict[str, Dict[str, Any]]:
        """Return navigation-specific D-Bus paths."""
        return {
            '/Heading': {'initial': None},
            '/Depth': {'initial': None},
            '/WaterTemperature': {'initial': None},
        }

    def _update(self) -> None:
        """Update navigation values from sensor data."""
        data = self._sensor_data

        heading_stale = data.is_stale('heading', self.MAX_DATA_AGE)
        depth_stale = data.is_stale('depth', self.MAX_DATA_AGE)
        temp_stale = data.is_stale('temp', self.MAX_DATA_AGE)

        if not heading_stale and data.heading_deg is not None:
            self._set_value('/Heading', round(data.heading_deg, 1))
        else:
            self._set_value('/Heading', None)

        if not depth_stale and data.depth_m is not None:
            self._set_value('/Depth', round(data.depth_m, 2))
        else:
            self._set_value('/Depth', None)

        if not temp_stale and data.water_temp_c is not None:
            self._set_value('/WaterTemperature', round(data.water_temp_c, 1))
        else:
            self._set_value('/WaterTemperature', None)

        has_any_data = (
            (not heading_stale and data.heading_deg is not None) or
            (not depth_stale and data.depth_m is not None) or
            (not temp_stale and data.water_temp_c is not None)
        )
        self.set_connected(has_any_data)
290 axiom-nmea/raymarine_nmea/venus_dbus/publisher.py Normal file
@@ -0,0 +1,290 @@
"""
Venus OS D-Bus Publisher.

This module provides the main VenusPublisher class that coordinates
all D-Bus services for publishing Raymarine sensor data to Venus OS.
"""

import logging
import signal
import sys
from typing import List, Optional, Set

from ..data.store import SensorData
from ..sensors import TANK_CONFIG, BATTERY_CONFIG
from .gps import GpsService
from .meteo import MeteoService
from .navigation import NavigationService
from .tank import TankService, create_tank_services
from .battery import BatteryService, create_battery_services
from .service import VeDbusServiceBase

logger = logging.getLogger(__name__)

# Try to import GLib for the main loop
try:
    from gi.repository import GLib
    HAS_GLIB = True
except ImportError:
    HAS_GLIB = False
    logger.warning("GLib not available. VenusPublisher.run() will not work.")


class VenusPublisher:
    """Coordinator for all Venus OS D-Bus services.

    This class manages the lifecycle of GPS, Meteo, Navigation, Tank,
    and Battery D-Bus services, handling registration, updates, and cleanup.

    Example:
        from raymarine_nmea import SensorData, RaymarineDecoder, MulticastListener
        from raymarine_nmea.venus_dbus import VenusPublisher

        # Create sensor data store
        sensor_data = SensorData()
        decoder = RaymarineDecoder()

        # Start multicast listener
        listener = MulticastListener(
            decoder=decoder,
            sensor_data=sensor_data,
            interface_ip="198.18.5.5",
        )
        listener.start()

        # Start Venus OS publisher
        publisher = VenusPublisher(sensor_data)
        publisher.run()  # Blocks until stopped

    For more control over the main loop:
        publisher = VenusPublisher(sensor_data)
        publisher.start()  # Non-blocking, registers services

        # Your own main loop here
        # Call publisher.update() periodically

        publisher.stop()  # Cleanup
    """

    # Default update interval in milliseconds
    DEFAULT_UPDATE_INTERVAL_MS = 1000

    def __init__(
        self,
        sensor_data: SensorData,
        enable_gps: bool = True,
        enable_meteo: bool = True,
        enable_navigation: bool = True,
        enable_tanks: bool = True,
        enable_batteries: bool = True,
        tank_ids: Optional[List[int]] = None,
        battery_ids: Optional[List[int]] = None,
        update_interval_ms: int = DEFAULT_UPDATE_INTERVAL_MS,
    ):
        """Initialize Venus Publisher.

        Args:
            sensor_data: SensorData instance to read values from
            enable_gps: Enable GPS service (default: True)
            enable_meteo: Enable Meteo/wind service (default: True)
            enable_navigation: Enable Navigation service (default: True)
            enable_tanks: Enable Tank services (default: True)
            enable_batteries: Enable Battery services (default: True)
            tank_ids: Specific tank IDs to publish (default: all configured)
            battery_ids: Specific battery IDs to publish (default: all configured)
            update_interval_ms: Update interval in milliseconds (default: 1000)
        """
        self._sensor_data = sensor_data
        self._update_interval_ms = update_interval_ms
        self._services: List[VeDbusServiceBase] = []
        self._running = False
        self._mainloop = None
        self._timer_id = None

        # Create enabled services
        if enable_gps:
            self._services.append(GpsService(sensor_data))

        if enable_meteo:
            self._services.append(MeteoService(sensor_data))

        if enable_navigation:
            self._services.append(NavigationService(sensor_data))

        if enable_tanks:
            self._services.extend(
                create_tank_services(sensor_data, tank_ids)
            )

        if enable_batteries:
            self._services.extend(
                create_battery_services(sensor_data, battery_ids)
            )

        logger.info(f"VenusPublisher initialized with {len(self._services)} services")

    def start(self) -> bool:
        """Register all D-Bus services.

        Returns:
            True if at least one service registered successfully
        """
        if self._running:
            logger.warning("VenusPublisher already running")
            return True

        registered = 0
        for service in self._services:
            if service.register():
                registered += 1
            else:
                logger.warning(f"Failed to register {service.service_name}")

        self._running = registered > 0

        if self._running:
            logger.info(f"VenusPublisher started: {registered}/{len(self._services)} services registered")
        else:
            logger.error("VenusPublisher failed to start: no services registered")

        return self._running

    def stop(self) -> None:
        """Stop and unregister all D-Bus services."""
        if not self._running:
            return

        # Stop timer if running in GLib main loop
        if self._timer_id is not None and HAS_GLIB:
            GLib.source_remove(self._timer_id)
            self._timer_id = None

        # Quit main loop if running
        if self._mainloop is not None:
            self._mainloop.quit()
            self._mainloop = None

        # Unregister all services
        for service in self._services:
            service.unregister()

        self._running = False
        logger.info("VenusPublisher stopped")

    def update(self) -> bool:
        """Update all D-Bus services.

        Call this periodically to refresh values.

        Returns:
            True to continue updates, False if all services failed
        """
        if not self._running:
            return False

        success = 0
        for service in self._services:
            if service.update():
                success += 1

        return success > 0

    def run(self) -> None:
        """Run the publisher with a GLib main loop.

        This method blocks until the publisher is stopped via stop()
        or a SIGINT/SIGTERM signal is received.

        Raises:
            RuntimeError: If GLib is not available
        """
        if not HAS_GLIB:
            raise RuntimeError(
                "GLib is required to run VenusPublisher. "
                "Install PyGObject or use start()/update()/stop() manually."
            )

        # Set up D-Bus main loop
        try:
            from dbus.mainloop.glib import DBusGMainLoop
            DBusGMainLoop(set_as_default=True)
        except ImportError:
            raise RuntimeError(
                "dbus-python with GLib support is required. "
                "Install python3-dbus on Venus OS."
            )

        # Start services
        if not self.start():
            logger.error("Failed to start VenusPublisher")
            return

        # Set up signal handlers for graceful shutdown
        def signal_handler(signum, frame):
            logger.info(f"Received signal {signum}, stopping...")
            self.stop()

        signal.signal(signal.SIGINT, signal_handler)
        signal.signal(signal.SIGTERM, signal_handler)

        # Set up periodic updates
        def update_callback():
            if not self._running:
                return False
            return self.update()

        self._timer_id = GLib.timeout_add(
            self._update_interval_ms,
            update_callback
        )

        # Run main loop
        logger.info("VenusPublisher running, press Ctrl+C to stop")
        self._mainloop = GLib.MainLoop()

        try:
            self._mainloop.run()
        except Exception as e:
            logger.error(f"Main loop error: {e}")
        finally:
            self.stop()

    @property
    def services(self) -> List[VeDbusServiceBase]:
        """Get list of all managed services."""
        return self._services.copy()

    @property
    def is_running(self) -> bool:
        """Check if publisher is running."""
        return self._running

    def add_service(self, service: VeDbusServiceBase) -> bool:
        """Add a custom service to the publisher.

        Args:
            service: Service instance to add

        Returns:
            True if added (and registered if already running)
        """
        self._services.append(service)

        if self._running:
            return service.register()
        return True

    def get_service_status(self) -> dict:
        """Get status of all services.

        Returns:
            Dict with service names and their registration status
        """
        return {
            service.service_name: {
                'registered': service._registered,
                'type': service.service_type,
                'product': service.product_name,
            }
            for service in self._services
        }
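The publisher's manual start()/update()/stop() contract can be exercised without D-Bus or GLib by substituting stub services. A sketch under that assumption (`StubService` and `MiniPublisher` are illustrative stand-ins, not repo classes):

```python
from typing import List


class StubService:
    """Hypothetical stand-in for VeDbusServiceBase, to illustrate the lifecycle."""

    def __init__(self, name: str, register_ok: bool = True):
        self.service_name = name
        self._register_ok = register_ok
        self.registered = False
        self.updates = 0

    def register(self) -> bool:
        self.registered = self._register_ok
        return self._register_ok

    def unregister(self) -> None:
        self.registered = False

    def update(self) -> bool:
        if self.registered:
            self.updates += 1
        return self.registered


class MiniPublisher:
    """Sketch of VenusPublisher's start()/update()/stop() semantics."""

    def __init__(self, services: List[StubService]):
        self._services = services
        self._running = False

    def start(self) -> bool:
        # Succeed if at least one service registered, as VenusPublisher does
        registered = sum(1 for s in self._services if s.register())
        self._running = registered > 0
        return self._running

    def update(self) -> bool:
        if not self._running:
            return False
        # Update every service; report success if any succeeded
        return sum(1 for s in self._services if s.update()) > 0

    def stop(self) -> None:
        for s in self._services:
            s.unregister()
        self._running = False
```

This mirrors the design choice above: one failed registration degrades the publisher rather than aborting it.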
301 axiom-nmea/raymarine_nmea/venus_dbus/service.py Normal file
@@ -0,0 +1,301 @@
"""
Base D-Bus service class for Venus OS integration.

This module provides a base class that wraps the VeDbusService from
Victron's velib_python library, following their standard patterns.
"""

import logging
import platform
import os
import sys
from abc import ABC, abstractmethod
from typing import Any, Dict, Optional, Callable

logger = logging.getLogger(__name__)

# Venus OS stores velib_python in /opt/victronenergy
VELIB_PATHS = [
    '/opt/victronenergy/dbus-systemcalc-py/ext/velib_python',
    '/opt/victronenergy/velib_python',
    os.path.join(os.path.dirname(__file__), '../../ext/velib_python'),
]

# Try to import vedbus from velib_python
VeDbusService = None
dbusconnection = None
for path in VELIB_PATHS:
    if os.path.exists(path):
        if path not in sys.path:
            sys.path.insert(0, path)
        try:
            from vedbus import VeDbusService
            # Also import dbusconnection for creating separate connections
            try:
                from dbusmonitor import DbusMonitor
                import dbus
            except ImportError:
                pass
            logger.debug(f"Loaded VeDbusService from {path}")
            break
        except ImportError:
            if path in sys.path:
                sys.path.remove(path)
            continue

if VeDbusService is None:
    logger.warning(
        "VeDbusService not available. Venus OS D-Bus publishing will be disabled. "
        "This is expected when not running on Venus OS."
    )


def get_dbus_connection():
    """Get a new private D-Bus connection.

    Each service needs its own connection to avoid path conflicts.
    """
    try:
        import dbus
        return dbus.SystemBus(private=True)
    except Exception as e:
        logger.error(f"Failed to create D-Bus connection: {e}")
        return None


class VeDbusServiceBase(ABC):
    """Base class for Venus OS D-Bus services.

    This class provides common functionality for creating D-Bus services
    that follow Victron's standards for Venus OS integration.

    Subclasses must implement:
        - service_type: The service type (e.g., 'gps', 'tank', 'battery')
        - product_name: Human-readable product name
        - _get_paths(): Returns dict of D-Bus paths to register
        - _update(): Called periodically to update values

    Example:
        class MyService(VeDbusServiceBase):
            service_type = 'myservice'
            product_name = 'My Custom Service'

            def _get_paths(self):
                return {
                    '/Value': {'initial': 0},
                    '/Status': {'initial': 'OK'},
                }

            def _update(self):
                self._set_value('/Value', 42)
    """

    # Subclasses must define these
    service_type: str = ""
    product_name: str = "Raymarine Sensor"
    product_id: int = 0xFFFF  # Custom product ID

    def __init__(
        self,
        device_instance: int = 0,
        connection: str = "Raymarine LightHouse",
        custom_name: Optional[str] = None,
    ):
        """Initialize the D-Bus service.

        Args:
            device_instance: Unique instance number for this device type
            connection: Connection description string
            custom_name: Optional custom name override
        """
        self.device_instance = device_instance
        self.connection = connection
        self.custom_name = custom_name
        self._dbusservice = None
        self._bus = None  # Private D-Bus connection for this service
        self._paths: Dict[str, Any] = {}
        self._registered = False

    @property
    def service_name(self) -> str:
        """Get the full D-Bus service name."""
        return f"com.victronenergy.{self.service_type}.raymarine_{self.device_instance}"

    @abstractmethod
    def _get_paths(self) -> Dict[str, Dict[str, Any]]:
        """Return the D-Bus paths to register.

        Returns:
            Dict mapping path names to their settings.
            Each setting dict can contain:
                - initial: Initial value
                - writeable: Whether external writes are allowed (default: False)
        """
        pass

    @abstractmethod
    def _update(self) -> None:
        """Update the D-Bus values.

        Called periodically to refresh values from sensor data.
        Use _set_value() to update individual paths.
        """
        pass

    def register(self) -> bool:
        """Register the service with D-Bus.

        Returns:
            True if registration succeeded, False otherwise
        """
        if VeDbusService is None:
            logger.warning(f"Cannot register {self.service_name}: VeDbusService not available")
            return False

        if self._registered:
            logger.warning(f"Service {self.service_name} already registered")
            return True

        try:
            # Create a private D-Bus connection for this service
            # Each service needs its own connection to avoid path conflicts
            self._bus = get_dbus_connection()
            if self._bus is None:
                logger.error(f"Failed to get D-Bus connection for {self.service_name}")
                return False

            # Create service with register=False (new API requirement)
            # This allows us to add all paths before registering
            self._dbusservice = VeDbusService(
                self.service_name,
                bus=self._bus,
                register=False
            )

            # Create management objects
            self._dbusservice.add_path('/Mgmt/ProcessName', __file__)
            self._dbusservice.add_path(
                '/Mgmt/ProcessVersion',
                f'1.0 (Python {platform.python_version()})'
            )
            self._dbusservice.add_path('/Mgmt/Connection', self.connection)

            # Create mandatory objects
            self._dbusservice.add_path('/DeviceInstance', self.device_instance)
            self._dbusservice.add_path('/ProductId', self.product_id)
            self._dbusservice.add_path(
                '/ProductName',
                self.custom_name or self.product_name
            )
            self._dbusservice.add_path('/FirmwareVersion', 1)
            self._dbusservice.add_path('/HardwareVersion', 0)
            self._dbusservice.add_path('/Connected', 1)

            # Add custom name if supported
            if self.custom_name:
                self._dbusservice.add_path('/CustomName', self.custom_name)

            # Register service-specific paths
            self._paths = self._get_paths()
            for path, settings in self._paths.items():
                initial = settings.get('initial', None)
                writeable = settings.get('writeable', False)
                self._dbusservice.add_path(
                    path,
                    initial,
                    writeable=writeable,
                    onchangecallback=self._handle_changed_value if writeable else None
                )

            # Complete registration after all paths are added
            self._dbusservice.register()
            self._registered = True

            logger.info(f"Registered D-Bus service: {self.service_name}")
            return True

        except Exception as e:
            logger.error(f"Failed to register {self.service_name}: {e}")
            return False

    def unregister(self) -> None:
        """Unregister the service from D-Bus."""
        if self._dbusservice and self._registered:
            # VeDbusService doesn't have a clean unregister method,
            # so we just mark ourselves as unregistered
            self._registered = False
            self._dbusservice = None
            # Close the private D-Bus connection
            if self._bus:
                try:
                    self._bus.close()
                except Exception:
                    pass
                self._bus = None
            logger.info(f"Unregistered D-Bus service: {self.service_name}")

    def update(self) -> bool:
        """Update the service values.

        Returns:
            True to continue updates, False to stop
        """
        if not self._registered:
            return False

        try:
            self._update()
            return True
        except Exception as e:
            logger.error(f"Error updating {self.service_name}: {e}")
            return True  # Keep trying

    def _set_value(self, path: str, value: Any) -> None:
        """Set a D-Bus path value.

        Args:
            path: The D-Bus path (e.g., '/Position/Latitude')
            value: The value to set
        """
        if self._dbusservice and self._registered:
            with self._dbusservice as s:
                if path in s:
                    s[path] = value

    def _get_value(self, path: str) -> Any:
        """Get a D-Bus path value.

        Args:
            path: The D-Bus path

        Returns:
            The current value, or None if not found
        """
        if self._dbusservice and self._registered:
            with self._dbusservice as s:
                if path in s:
                    return s[path]
        return None

    def _handle_changed_value(self, path: str, value: Any) -> bool:
        """Handle external value changes.

        Override this method to handle writes from other processes.

        Args:
            path: The D-Bus path that was changed
            value: The new value

        Returns:
            True to accept the change, False to reject
        """
        logger.debug(f"External change: {path} = {value}")
        return True

    def set_connected(self, connected: bool) -> None:
        """Set the connection status.

        Args:
            connected: True if data source is connected
        """
        self._set_value('/Connected', 1 if connected else 0)
349 axiom-nmea/raymarine_nmea/venus_dbus/tank.py Normal file
@@ -0,0 +1,349 @@
"""
Tank D-Bus service for Venus OS.

Publishes tank level data to the Venus OS D-Bus using the
com.victronenergy.tank service type.

Each physical tank requires a separate D-Bus service instance.

D-Bus paths:
    /Level     - Tank level 0-100%
    /Remaining - Remaining volume in m3
    /Status    - 0=Ok; 1=Disconnected; 2=Short circuited; 3=Reverse polarity; 4=Unknown
    /Capacity  - Tank capacity in m3
    /FluidType - 0=Fuel; 1=Fresh water; 2=Waste water; etc.
    /Standard  - 2 (Not applicable for voltage/current sensors)

Water tanks (tank_type "water") also publish:
    /Alarms/LowLevel            - 0=OK, 2=Alarm (level < 10% for 60s)
    /Alarms/LowLevelAck         - Writable. 0=none, 1=acknowledged, 2=snoozed
    /Alarms/LowLevelSnoozeUntil - Writable. Unix timestamp when snooze expires

Black water tanks (tank_type "blackwater") also publish:
    /Alarms/HighLevel            - 0=OK, 2=Alarm (level > 75% for 60s)
    /Alarms/HighLevelAck         - Writable. 0=none, 1=acknowledged, 2=snoozed
    /Alarms/HighLevelSnoozeUntil - Writable. Unix timestamp when snooze expires
"""

import logging
import time
from typing import Any, Dict, Optional, List, Tuple

from .service import VeDbusServiceBase
from ..data.store import SensorData
from ..sensors import TANK_CONFIG, TankInfo, get_tank_name, get_tank_capacity

logger = logging.getLogger(__name__)

# Alarm thresholds
WATER_LOW_LEVEL_THRESHOLD = 10.0         # Alarm if water tank below this percentage
BLACK_WATER_HIGH_LEVEL_THRESHOLD = 75.0  # Alarm if black water above this percentage
TANK_ALARM_DELAY = 60.0                  # Seconds before alarm triggers (avoids brief dips)

# Fuel tank IDs - these get cached when engines are off
FUEL_TANK_IDS = {1, 2}  # Fuel Starboard and Fuel Port

# Memory cache for fuel tank levels (persists while service runs)
# Key: tank_id, Value: (level_percent, timestamp)
_fuel_tank_cache: Dict[int, Tuple[float, float]] = {}

# Fluid type mapping from tank_type string to Victron enum
FLUID_TYPE_MAP = {
    'fuel': 0,        # Fuel
    'water': 1,       # Fresh water
    'waste': 2,       # Waste water
    'livewell': 3,    # Live well
    'oil': 4,         # Oil
    'blackwater': 5,  # Black water (sewage)
    'gasoline': 6,    # Gasoline
    'diesel': 7,      # Diesel
    'lpg': 8,         # Liquid Petroleum Gas
    'lng': 9,         # Liquid Natural Gas
    'hydraulic': 10,  # Hydraulic oil
    'rawwater': 11,   # Raw water
}

# Gallons to cubic meters
GALLONS_TO_M3 = 0.00378541

class TankService(VeDbusServiceBase):
    """Tank D-Bus service for Venus OS.

    Publishes a single tank's level data to the Venus OS D-Bus.
    Create one instance per physical tank.

    Example:
        sensor_data = SensorData()

        # Create service for tank ID 1
        tank_service = TankService(
            sensor_data=sensor_data,
            tank_id=1,
            device_instance=0,
        )
        tank_service.register()

        # In update loop:
        tank_service.update()
    """

    service_type = "tank"
    product_name = "Raymarine Tank Sensor"
    product_id = 0xA142  # Custom product ID for Raymarine Tank

    # Maximum age in seconds before data is considered stale
    MAX_DATA_AGE = 30.0

    def __init__(
        self,
        sensor_data: SensorData,
        tank_id: int,
        device_instance: int = 0,
        tank_config: Optional[TankInfo] = None,
        custom_name: Optional[str] = None,
    ):
        """Initialize Tank service.

        Args:
            sensor_data: SensorData instance to read values from
            tank_id: The Raymarine tank ID
            device_instance: Unique instance number for this tank
            tank_config: Optional TankInfo override (otherwise uses TANK_CONFIG)
            custom_name: Optional custom display name
        """
        # Get tank configuration
        if tank_config:
            self._tank_config = tank_config
        elif tank_id in TANK_CONFIG:
            self._tank_config = TANK_CONFIG[tank_id]
        else:
            self._tank_config = TankInfo(f"Tank #{tank_id}", None, "fuel")

        # Use tank name as product name
        self.product_name = self._tank_config.name

        super().__init__(
            device_instance=device_instance,
            connection=f"Raymarine Tank {tank_id}",
            custom_name=custom_name or self._tank_config.name,
        )
        self._sensor_data = sensor_data
        self._tank_id = tank_id

        # Check if this is a fuel tank (should use caching)
        self._is_fuel_tank = tank_id in FUEL_TANK_IDS

        # Alarm tracking -- only for water and blackwater tanks
        tank_type = self._tank_config.tank_type.lower()
        self._has_low_alarm = tank_type == 'water'
        self._has_high_alarm = tank_type == 'blackwater'
        self._alarm_since: Optional[float] = None
        self._alarm_logged = False

    @property
    def service_name(self) -> str:
        """Get the full D-Bus service name."""
        return f"com.victronenergy.tank.raymarine_tank{self._tank_id}_{self.device_instance}"

    def _get_fluid_type(self) -> int:
        """Get the Victron fluid type enum from tank config."""
        tank_type = self._tank_config.tank_type.lower()
        return FLUID_TYPE_MAP.get(tank_type, 0)  # Default to fuel

    def _get_capacity_m3(self) -> Optional[float]:
        """Get tank capacity in cubic meters."""
        if self._tank_config.capacity_gallons is None:
            return None
        return self._tank_config.capacity_gallons * GALLONS_TO_M3

    def _get_paths(self) -> Dict[str, Dict[str, Any]]:
        """Return tank-specific D-Bus paths."""
        capacity_m3 = self._get_capacity_m3()

        paths: Dict[str, Dict[str, Any]] = {
            '/Level': {'initial': None},
            '/Remaining': {'initial': None},
            '/Status': {'initial': 4},  # Unknown until data received
            '/Capacity': {'initial': capacity_m3},
            '/FluidType': {'initial': self._get_fluid_type()},
            '/Standard': {'initial': 2},  # Not applicable
        }

        if self._has_low_alarm:
            paths['/Alarms/LowLevel'] = {'initial': 0}
            paths['/Alarms/LowLevelAck'] = {'initial': 0, 'writeable': True}
            paths['/Alarms/LowLevelSnoozeUntil'] = {'initial': 0, 'writeable': True}

        if self._has_high_alarm:
            paths['/Alarms/HighLevel'] = {'initial': 0}
            paths['/Alarms/HighLevelAck'] = {'initial': 0, 'writeable': True}
            paths['/Alarms/HighLevelSnoozeUntil'] = {'initial': 0, 'writeable': True}

        return paths
    def _update(self) -> None:
        """Update tank values from sensor data."""
        global _fuel_tank_cache

        data = self._sensor_data
        now = time.time()

        # Check data freshness
        is_stale = data.is_stale('tank', self.MAX_DATA_AGE)

        # Get tank level from live data
        level = data.tanks.get(self._tank_id)
        using_cached = False

        # For fuel tanks: cache when present, use cache when absent.
        # Fuel doesn't change when engines are off.
        if self._is_fuel_tank:
            if level is not None and not is_stale:
                # Live data available - cache it
                _fuel_tank_cache[self._tank_id] = (level, now)
            elif self._tank_id in _fuel_tank_cache:
                # No live data - use cached value
                cached_level, cached_time = _fuel_tank_cache[self._tank_id]
                level = cached_level
                using_cached = True
                # Don't consider cached data as stale for fuel tanks
                is_stale = False

        if level is not None and not is_stale:
            # Valid tank data (live or cached)
            self._set_value('/Level', round(level, 1))
            self._set_value('/Status', 0)  # OK

            # Calculate remaining volume
            capacity_m3 = self._get_capacity_m3()
            if capacity_m3 is not None:
                remaining = capacity_m3 * (level / 100.0)
                self._set_value('/Remaining', round(remaining, 4))
            else:
                self._set_value('/Remaining', None)

            # Check tank-level alarms (skip for cached values)
            if not using_cached:
                self._check_level_alarm(level, now)

        else:
            # No data or stale
            self._set_value('/Level', None)
            self._set_value('/Remaining', None)

            if is_stale and self._tank_id in data.tanks:
                # Had data but now stale
                self._set_value('/Status', 1)  # Disconnected
            else:
                # Never had data
                self._set_value('/Status', 4)  # Unknown

        # Update connection status (cached values count as connected for fuel tanks)
        self.set_connected(level is not None and not is_stale)

    def _check_level_alarm(self, level: float, now: float) -> None:
        """Check tank level against alarm thresholds and manage alarm state.

        For water tanks: alarms when level < WATER_LOW_LEVEL_THRESHOLD.
        For black water: alarms when level > BLACK_WATER_HIGH_LEVEL_THRESHOLD.
        Both use TANK_ALARM_DELAY before triggering.

        Also handles snooze expiry: if snoozed (Ack=2) and the snooze time
        has passed, resets Ack to 0 so the UI re-alerts.
        """
        if self._has_low_alarm:
            alarm_path = '/Alarms/LowLevel'
            ack_path = '/Alarms/LowLevelAck'
            snooze_path = '/Alarms/LowLevelSnoozeUntil'
            condition_met = level < WATER_LOW_LEVEL_THRESHOLD
            threshold_desc = f"below {WATER_LOW_LEVEL_THRESHOLD}%"
        elif self._has_high_alarm:
            alarm_path = '/Alarms/HighLevel'
            ack_path = '/Alarms/HighLevelAck'
            snooze_path = '/Alarms/HighLevelSnoozeUntil'
            condition_met = level > BLACK_WATER_HIGH_LEVEL_THRESHOLD
            threshold_desc = f"above {BLACK_WATER_HIGH_LEVEL_THRESHOLD}%"
        else:
            return

        if condition_met:
            if self._alarm_since is None:
                self._alarm_since = now

            elapsed = now - self._alarm_since

            if elapsed >= TANK_ALARM_DELAY:
                self._set_value(alarm_path, 2)
                if not self._alarm_logged:
                    logger.warning(
                        f"ALARM: {self._tank_config.name} at {level:.1f}% - "
                        f"{threshold_desc} for {elapsed:.0f}s"
                    )
                    self._alarm_logged = True

                # Check snooze expiry: if snoozed and past expiry, re-trigger
                ack_val = self._get_value(ack_path)
                snooze_until = self._get_value(snooze_path)
                if ack_val == 2 and snooze_until and now > snooze_until:
                    logger.info(
                        f"Snooze expired for {self._tank_config.name}, "
                        f"re-triggering alarm"
                    )
                    self._set_value(ack_path, 0)
                    self._set_value(snooze_path, 0)
        else:
            # Condition cleared
            if self._alarm_since is not None:
                if self._alarm_logged:
                    logger.info(
                        f"{self._tank_config.name} recovered to {level:.1f}%"
                    )
                self._alarm_since = None
                self._alarm_logged = False

            self._set_value(alarm_path, 0)
            self._set_value(ack_path, 0)
            self._set_value(snooze_path, 0)

    def _handle_changed_value(self, path: str, value: Any) -> bool:
        """Accept external writes for alarm acknowledgement and snooze paths."""
        writable = {
            '/Alarms/LowLevelAck', '/Alarms/LowLevelSnoozeUntil',
            '/Alarms/HighLevelAck', '/Alarms/HighLevelSnoozeUntil',
        }
        if path in writable:
            logger.info(f"External write: {self._tank_config.name} {path} = {value}")
            return True
        return super()._handle_changed_value(path, value)


def create_tank_services(
    sensor_data: SensorData,
    tank_ids: Optional[List[int]] = None,
    start_instance: int = 0,
) -> List[TankService]:
    """Create TankService instances for multiple tanks.

    Args:
        sensor_data: SensorData instance to read values from
        tank_ids: List of tank IDs to create services for.
            If None, creates services for all configured tanks.
        start_instance: Starting device instance number

    Returns:
        List of TankService instances
    """
    if tank_ids is None:
        tank_ids = list(TANK_CONFIG.keys())

    services = []
    for i, tank_id in enumerate(tank_ids):
        service = TankService(
            sensor_data=sensor_data,
            tank_id=tank_id,
            device_instance=start_instance + i,
        )
        services.append(service)

    return services
11 axiom-nmea/requirements.txt Normal file
@@ -0,0 +1,11 @@
# Raymarine NMEA Decoder Dependencies

# No external dependencies required - using only standard library
# The script uses:
# - socket (UDP multicast)
# - struct (binary parsing)
# - argparse (CLI arguments)
# - json (JSON output)
# - datetime (timestamps)
# - threading (multi-group listener)
# - collections (data buffering)
0 axiom-nmea/samples/.gitkeep Normal file
40 axiom-nmea/samples/README.md Normal file
@@ -0,0 +1,40 @@
# Sample Packet Captures

This directory contains sample Raymarine network packet captures for testing and development.

## Expected Files

Place your `.pcap` capture files here:

| File | Description |
|------|-------------|
| `raymarine_sample.pcap` | General sample data with mixed sensor readings |
| `raymarine_sample_TWD_62-70_HDG_29-35.pcap` | Capture with known TWD (62-70°) and Heading (29-35°) |
| `raymarine_sample_twd_69-73.pcap` | Additional wind direction samples (TWD 69-73°) |

## Creating Captures

To capture Raymarine network traffic:

```bash
# Using tcpdump (Linux/macOS)
sudo tcpdump -i eth0 -w samples/raymarine_sample.pcap udp port 2565

# Capture from a specific multicast group
sudo tcpdump -i eth0 -w samples/raymarine_sample.pcap host 226.192.206.102
```

## Using Captures

Most debug scripts accept a `--pcap` argument:

```bash
# From the project root
python debug/protobuf_decoder.py --pcap samples/raymarine_sample.pcap
python debug/raymarine_decoder.py --pcap samples/raymarine_sample.pcap
python examples/pcap-to-nmea/pcap_to_nmea.py samples/raymarine_sample.pcap
```

## Note

The `.pcap` files themselves are not committed to git (listed in `.gitignore`) to keep the repository size manageable. You'll need to create your own captures or obtain them separately.
24 dbus-generator-ramp/.gitignore vendored Normal file
@@ -0,0 +1,24 @@
# Build artifacts
*.tar.gz
*.sha256

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg
*.egg-info/
dist/
build/

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# Venus OS runtime (created during installation)
ext/
47 dbus-generator-ramp/Dockerfile Normal file
@@ -0,0 +1,47 @@
# Dockerfile for Generator Current Ramp Controller Development
#
# This provides a development environment with D-Bus support for testing.
# Note: Full integration testing requires a real Venus OS device.
#
# Usage:
#   docker-compose build
#   docker-compose run --rm dev python -m pytest tests/
#   docker-compose run --rm dev python overload_detector.py  # Run unit tests
#

FROM python:3.11-slim

# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    dbus \
    libdbus-1-dev \
    libdbus-glib-1-dev \
    libgirepository1.0-dev \
    gcc \
    pkg-config \
    python3-gi \
    python3-gi-cairo \
    gir1.2-glib-2.0 \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
RUN pip install --no-cache-dir \
    dbus-python \
    PyGObject \
    numpy \
    pytest

# Create working directory
WORKDIR /app

# Copy application code
COPY . /app/

# Create ext directory for velib_python (will be mounted or mocked)
RUN mkdir -p /app/ext/velib_python

# Set Python path
ENV PYTHONPATH="/app:/app/ext/velib_python"

# Default command: run tests
CMD ["python", "-m", "pytest", "-v"]
462 dbus-generator-ramp/README.md Normal file
@@ -0,0 +1,462 @@
# Generator Current Ramp Controller for Venus OS

A Venus OS addon that dynamically controls the inverter/charger input current limit when running on generator power, preventing generator overload through intelligent ramping, automatic rollback, and adaptive learning.

## Features

- **Preemptive Protection**: Sets a 40A limit when generator warm-up is detected
- **Gradual Ramp**: Increases from 40A to 50A over 30 minutes
- **Overload Detection**: Monitors power fluctuations to detect generator stress
- **Fast Recovery**: Quickly ramps back to just below the overload point, then slow-ramps from there
- **Rapid Overload Protection**: Increases safety margins when overloads occur in quick succession
- **Output Power Correlation**: Learns the relationship between output loads and safe input current
- **Persistent Learning**: The model survives reboots and continuously improves over time

## How It Works

### Basic Flow

```
Generator Starts → Warm-up (40A) → AC Connects → Ramp 40A→50A → Stable
                                                                   ↓
                                                          Overload detected
                                                                   ↓
                                                          Rollback to 40A
                                                                   ↓
                                                           5 min cooldown
                                                                   ↓
                                                  FAST ramp to (overload - 4A)
                                                                   ↓
                                                  SLOW ramp to recovery target
```

### Fast Recovery Algorithm

When an overload is detected, instead of slowly ramping from 40A all the way back up:

1. **Immediate rollback** to 40A (safety)
2. **5-minute cooldown** to let the generator stabilize
3. **Fast ramp phase**: Ramp quickly at 5 A/min to `(overload_point - 4A)`
4. **Slow ramp phase**: Then ramp at 0.5 A/min to the recovery target

**Example**: Overload at 48A

- Fast ramp: 40A → 44A in ~48 seconds
- Slow ramp: 44A → 46A (recovery target) in ~4 minutes
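The two recovery targets follow directly from the overload point. A minimal sketch of that arithmetic (the function name and its defaults are illustrative, not the addon's actual API):

```python
def recovery_targets(overload_amps, fast_margin=4.0, recovery_margin=2.0,
                     floor=40.0):
    """Compute both ramp targets after an overload at `overload_amps`.

    The fast ramp stops `fast_margin` amps short of the overload point;
    the final slow-ramp target sits `recovery_margin` amps below it.
    Neither target drops below the 40A safety floor.
    """
    fast_target = max(floor, overload_amps - fast_margin)
    slow_target = max(floor, overload_amps - recovery_margin)
    return fast_target, slow_target

# Overload at 48A → fast ramp to 44A, then slow ramp to 46A
print(recovery_targets(48.0))  # → (44.0, 46.0)
```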
### Rapid Overload Protection

If an overload occurs again within 2 minutes, the margins increase:

| Overload    | Fast Ramp Margin | Recovery Margin |
| ----------- | ---------------- | --------------- |
| 1st         | 4A               | 2A              |
| 2nd (rapid) | 6A               | 4A              |
| 3rd (rapid) | 8A               | 6A              |

This prevents repeated overload cycles by becoming progressively more conservative.
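The escalation in the table is a fixed extra margin per rapid overload. A sketch under that assumption (hypothetical helper, not the addon's code):

```python
def overload_margins(rapid_count, fast_margin=4.0, recovery_margin=2.0,
                     extra_per_rapid=2.0):
    """Both margins grow by `extra_per_rapid` amps per rapid overload.

    rapid_count is 0 for the first overload, 1 for a second overload
    inside the 2-minute window, and so on.
    """
    extra = rapid_count * extra_per_rapid
    return fast_margin + extra, recovery_margin + extra

# 1st overload → (4.0, 2.0); 3rd rapid overload → (8.0, 6.0)
```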
### Output Power Correlation Learning

The system learns that **higher output loads allow higher input current**.

**Why?** When external loads (AC output) are high, power flows directly through to the loads. When loads are low, all input power goes to battery charging, stressing the inverter more.

**Model**: `max_input_current = base + (slope × output_power) + zone_offset`

| Output Power | Zone | Offset | Example Max Input |
| ------------ | ---- | ------ | ----------------- |
| 0-2000W      | LOW  | -2A    | 44 + 1 - 2 = 43A  |
| 2000-4000W   | MED  | 0A     | 44 + 3 + 0 = 47A  |
| 4000-8000W   | HIGH | +4A    | 44 + 6 + 4 = 54A  |

The model learns from:

- **Overload events**: A strong signal that the limit was too high for that output level
- **Stable operation**: Confirms the current limit is safe at that output level

Model parameters persist to `/data/dbus-generator-ramp/learned_model.json`.
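The model formula above can be evaluated directly from the zone table. A sketch with the default base and slope (the function is illustrative; the real controller holds these values in its learned state):

```python
# Zone table: (min_W, max_W, offset_A) -- mirrors the table above
POWER_ZONES = {
    'LOW': (0, 2000, -2.0),
    'MEDIUM': (2000, 4000, 0.0),
    'HIGH': (4000, 8000, 4.0),
}

def max_input_current(output_w, base=44.0, slope=0.001):
    """Evaluate base + (slope × output_power) + zone_offset."""
    offset = 0.0
    for lo, hi, zone_offset in POWER_ZONES.values():
        if lo <= output_w < hi:
            offset = zone_offset
            break
    return base + slope * output_w + offset

# 6000 W output falls in the HIGH zone: 44 + 6 + 4 = 54 A
print(max_input_current(6000))
```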
### State Machine

| State    | Description                                              |
| -------- | -------------------------------------------------------- |
| IDLE     | Waiting for generator to start                           |
| WARMUP   | Generator warming up, AC not connected, limit set to 40A |
| RAMPING  | Gradually increasing current from 40A to target          |
| COOLDOWN | Waiting at 40A after overload (5 minutes)                |
| RECOVERY | Fast ramp then slow ramp back up after overload          |
| STABLE   | At target, monitoring for overload                       |

### Overload Detection

The detector uses two methods that must both agree:

1. **Rate-of-Change Reversals**: Counts rapid sign changes in the power derivative
2. **Detrended Standard Deviation**: Measures oscillation amplitude after removing smooth trends

This combination:

- ✅ Detects: Erratic oscillations (generator overload)
- ❌ Ignores: Smooth load increases/decreases (normal operation)
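The two signals can be sketched in a few lines. This is an illustrative reimplementation, not the project's `overload_detector.py`; the moving-average detrend window is an assumption:

```python
import statistics

def looks_overloaded(samples, deriv_threshold=150.0, reversal_threshold=5,
                     std_threshold=250.0, window=5):
    """Flag overload only when BOTH signals agree.

    1) Count sign reversals among power deltas larger than deriv_threshold.
    2) Detrend with a centered moving average, measure residual spread.
    """
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    big = [d for d in deltas if abs(d) >= deriv_threshold]
    reversals = sum(1 for a, b in zip(big, big[1:]) if a * b < 0)

    residuals = []
    for i, p in enumerate(samples):
        lo, hi = max(0, i - window), min(len(samples), i + window + 1)
        trend = sum(samples[lo:hi]) / (hi - lo)
        residuals.append(p - trend)
    std = statistics.pstdev(residuals)

    return reversals >= reversal_threshold and std >= std_threshold

print(looks_overloaded([0, 600] * 8))               # erratic oscillation → True
print(looks_overloaded(list(range(0, 3200, 200))))  # smooth ramp → False
```

A smooth ramp produces large deltas but no sign reversals, so the first condition alone already rejects it.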
## Installation

### Prerequisites

- Venus OS v3.10 or later (for warm-up/cool-down support)
- VE.Bus firmware 415 or later
- SSH/root access to the Cerbo GX

### Steps

1. **Copy files to Venus OS**:
   ```bash
   # From your local machine
   scp -r dbus-generator-ramp root@<cerbo-ip>:/data/
   ```
2. **Make scripts executable**:
   ```bash
   ssh root@<cerbo-ip>
   chmod +x /data/dbus-generator-ramp/service/run
   chmod +x /data/dbus-generator-ramp/service/log/run
   ```
3. **Create symlink to velib_python**:
   ```bash
   ln -s /opt/victronenergy/velib_python /data/dbus-generator-ramp/ext/velib_python
   ```
4. **Create service symlink**:
   ```bash
   ln -s /data/dbus-generator-ramp/service /opt/victronenergy/service/dbus-generator-ramp
   ```
5. **The service will start automatically** (managed by daemontools)

### Persistence Across Firmware Updates

Add to `/data/rc.local`:

```bash
#!/bin/bash

# Restore generator ramp controller service link
if [ ! -L /opt/victronenergy/service/dbus-generator-ramp ]; then
    ln -s /data/dbus-generator-ramp/service /opt/victronenergy/service/dbus-generator-ramp
fi
```

## Configuration

Edit `/data/dbus-generator-ramp/config.py` to adjust:

### D-Bus Services

```python
DBUS_CONFIG = {
    'vebus_service': 'com.victronenergy.vebus.ttyS4',  # Your VE.Bus service
    'generator_service': 'com.victronenergy.generator.startstop0',
}
```

### Current Limits

```python
RAMP_CONFIG = {
    'initial_current': 40.0,     # Starting current (A)
    'target_current': 50.0,      # Target current (A)
    'initial_ramp_rate': 0.333,  # A/min (10A over 30 min)
    'recovery_ramp_rate': 0.5,   # A/min during recovery
    'cooldown_duration': 300,    # Seconds at 40A after overload

    # Fast recovery settings
    'fast_recovery_margin': 4.0,         # Amps below overload point
    'fast_ramp_rate': 5.0,               # A/min during fast recovery
    'rapid_overload_threshold': 120,     # Seconds - rapid if within this
    'rapid_overload_extra_margin': 2.0,  # Extra margin per rapid overload
}
```
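The ramp rates above translate into a simple linear clamp on the limit. A sketch (hypothetical helper; the controller's actual stepping logic lives in `ramp_controller.py`):

```python
def ramp_value(start, target, rate_a_per_min, elapsed_s):
    """Current limit during a linear ramp, clamped at the target."""
    return min(target, start + rate_a_per_min * elapsed_s / 60.0)

# Fast recovery at 5 A/min covers 40A → 44A in 48 seconds
print(ramp_value(40.0, 50.0, 5.0, 48))  # → 44.0

# The initial ramp at 0.333 A/min takes roughly 30 minutes to reach 50A
print(ramp_value(40.0, 50.0, 0.333, 30 * 60))
```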
### Output Power Learning

```python
LEARNING_CONFIG = {
    'enabled': True,
    'power_zones': {
        'LOW': (0, 2000, -2.0),  # (min_W, max_W, offset_A)
        'MEDIUM': (2000, 4000, 0.0),
        'HIGH': (4000, 8000, 4.0),
    },
    'initial_base_current': 44.0,  # Starting base for model
    'initial_slope': 0.001,        # Amps per Watt
    'learning_rate': 0.1,          # How fast model adapts
    'min_confidence': 5,           # Data points before using model
}
```

### Overload Detection

```python
OVERLOAD_CONFIG = {
    'derivative_threshold': 150,  # Min power change to count (W)
    'reversal_threshold': 5,      # Reversals to trigger
    'std_dev_threshold': 250,     # Std dev threshold (W)
    'confirmation_threshold': 6,  # Confirmations needed
}
```
## Monitoring

### D-Bus Paths

The service publishes status to `com.victronenergy.generatorramp`:

**Core Status**

| Path              | Description                                                                   |
| ----------------- | ----------------------------------------------------------------------------- |
| `/State`          | Current state (0=Idle, 1=Warmup, 2=Ramping, 3=Cooldown, 4=Recovery, 5=Stable) |
| `/CurrentLimit`   | Current input limit (A)                                                       |
| `/TargetLimit`    | Target for current ramp (A)                                                   |
| `/RecoveryTarget` | Conservative target after overload (A)                                        |
| `/OverloadCount`  | Total overloads this session                                                  |

**Power Monitoring**

| Path                               | Description              |
| ---------------------------------- | ------------------------ |
| `/Power/L1`, `/L2`, `/Total`       | Input power (W)          |
| `/OutputPower/L1`, `/L2`, `/Total` | Output power / loads (W) |

**Fast Recovery Status**

| Path                           | Description                         |
| ------------------------------ | ----------------------------------- |
| `/Recovery/InFastRamp`         | 1 if currently in fast ramp phase   |
| `/Recovery/FastRampTarget`     | Target for fast ramp (A)            |
| `/Recovery/RapidOverloadCount` | Count of rapid successive overloads |

**Learning Model Status**

| Path                        | Description                                    |
| --------------------------- | ---------------------------------------------- |
| `/Learning/Confidence`      | Model confidence (data points)                 |
| `/Learning/ConfidenceLevel` | LOW, MEDIUM, or HIGH                           |
| `/Learning/BaseCurrent`     | Learned base current (A)                       |
| `/Learning/SuggestedLimit`  | Model's suggested limit for current output (A) |
| `/Learning/DataPoints`      | Number of stable operation data points         |
| `/Learning/OverloadPoints`  | Number of recorded overload events             |

### View Logs

```bash
# Live log
tail -F /var/log/dbus-generator-ramp/current | tai64nlocal

# Recent log
cat /var/log/dbus-generator-ramp/current | tai64nlocal
```

### Service Control

```bash
# Stop service
svc -d /service/dbus-generator-ramp

# Start service
svc -u /service/dbus-generator-ramp

# Restart service
svc -t /service/dbus-generator-ramp

# Check status
svstat /service/dbus-generator-ramp
```

### Manual Testing

```bash
# Check generator state
dbus -y com.victronenergy.generator.startstop0 /State GetValue

# Check current limit
dbus -y com.victronenergy.vebus.ttyS4 /Ac/In/1/CurrentLimit GetValue

# Check input power
dbus -y com.victronenergy.vebus.ttyS4 /Ac/ActiveIn/L1/P GetValue
dbus -y com.victronenergy.vebus.ttyS4 /Ac/ActiveIn/L2/P GetValue

# Check output power (loads)
dbus -y com.victronenergy.vebus.ttyS4 /Ac/Out/L1/P GetValue
dbus -y com.victronenergy.vebus.ttyS4 /Ac/Out/L2/P GetValue

# Check learning model status
dbus -y com.victronenergy.generatorramp /Learning/SuggestedLimit GetValue
dbus -y com.victronenergy.generatorramp /Learning/Confidence GetValue
```
## Native GUI Integration (Optional)

Add a menu item to the Cerbo's built-in GUI:

```bash
# Install GUI modifications
./install_gui.sh

# Remove GUI modifications
./install_gui.sh --remove
```

This adds **Settings → Generator start/stop → Dynamic current ramp** with:

- Enable/disable toggle
- Current status display
- All configurable settings

**Note**: GUI changes are lost on firmware update. Run `./install_gui.sh` again after updating.

## Web UI (Optional)

A browser-based interface at `http://<cerbo-ip>:8088`:

```bash
# Install with web UI
./install.sh --webui
```

## Tuning

### Overload Detection

The default thresholds may need adjustment for your specific generator:

1. **Collect data**: Log power readings during normal operation and overload
   ```bash
   # Add to config.py: LOGGING_CONFIG['level'] = 'DEBUG'
   ```
2. **Analyze patterns**:
   - Normal: Low std dev, few reversals
   - Overload: High std dev, many reversals
3. **Adjust thresholds**:
   - If false positives: Increase thresholds
   - If missed overloads: Decrease thresholds

### Learning Model

To tune the power correlation model for your system:

1. **Observe stable operation** at different output power levels
2. **Record what input current works** at each level
3. **Adjust `LEARNING_CONFIG`**:
   - `initial_base_current`: Max input at 0W output
   - `initial_slope`: How much more input per Watt of output
   - `power_zones`: Offsets for different load ranges

**Example tuning**: If you can run 54A stable at 6kW output:

```
54 = base + (0.001 × 6000) + 4   (HIGH zone)
54 = base + 6 + 4
base = 44A
```
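That back-calculation is just the model formula rearranged. A one-line sketch (hypothetical helper for tuning by hand, not part of the addon):

```python
def solve_base(observed_limit, output_w, zone_offset, slope=0.001):
    """Back out the model's base current from one stable observation."""
    return observed_limit - slope * output_w - zone_offset

# 54A stable at 6kW output (HIGH zone, +4A offset) implies a base of 44A
print(round(solve_base(54.0, 6000, 4.0), 1))  # → 44.0
```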
## Troubleshooting

### Service Won't Start

```bash
# Check for Python errors
python3 /data/dbus-generator-ramp/dbus-generator-ramp.py

# Check D-Bus services exist
dbus -y | grep vebus
dbus -y | grep generator
```

### Current Limit Not Changing

```bash
# Check if adjustable
dbus -y com.victronenergy.vebus.ttyS4 /Ac/In/1/CurrentLimitIsAdjustable GetValue
# Should return 1
```

### Generator State Not Detected

```bash
# Monitor generator state changes
watch -n 1 'dbus -y com.victronenergy.generator.startstop0 /State GetValue'
```

### Reset Learning Model

```bash
# Delete the learned model to start fresh
rm /data/dbus-generator-ramp/learned_model.json
svc -t /service/dbus-generator-ramp
```

## Development

### Local Testing (without Venus OS)

```bash
# Build development environment
docker-compose build

# Run component tests
docker-compose run --rm dev python overload_detector.py
docker-compose run --rm dev python ramp_controller.py

# Interactive shell
docker-compose run --rm dev bash
```

### Running Tests

```bash
python3 overload_detector.py   # Runs built-in tests
python3 ramp_controller.py     # Runs built-in tests (includes fast recovery)
```

## File Structure

```
/data/dbus-generator-ramp/
├── dbus-generator-ramp.py    # Main application
├── config.py                 # Configuration
├── overload_detector.py      # Power fluctuation analysis
├── ramp_controller.py        # Current ramping + learning model
├── learned_model.json        # Persisted learning data (created at runtime)
├── service/
│   ├── run                   # daemontools run script
│   └── log/
│       └── run               # multilog script
├── ext/
│   └── velib_python/         # Symlink to Venus library
├── Dockerfile                # Development environment
├── docker-compose.yml        # Development compose
└── README.md                 # This file
```

## License

MIT License - See LICENSE file

## Acknowledgments

- Victron Energy for Venus OS and the excellent D-Bus API
- The Victron Community modifications forum

190  dbus-generator-ramp/build-package.sh  Executable file
@@ -0,0 +1,190 @@
#!/bin/bash
#
# Build script for Generator Current Ramp Controller
#
# Creates a tar.gz package that can be:
#   1. Copied to a CerboGX device
#   2. Untarred to /data/
#   3. Installed by running install.sh
#
# Usage:
#   ./build-package.sh                    # Creates package with default name
#   ./build-package.sh --version 1.0.0    # Creates package with version in name
#   ./build-package.sh --output /path/    # Specify output directory
#
# Output: dbus-generator-ramp-<version>.tar.gz
#
# Installation on CerboGX:
#   scp dbus-generator-ramp-*.tar.gz root@<cerbo-ip>:/data/
#   ssh root@<cerbo-ip>
#   cd /data
#   tar -xzf dbus-generator-ramp-*.tar.gz
#   cd dbus-generator-ramp
#   ./install.sh [--webui]
#

set -e

# Script directory (where the source files are)
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# Default values
VERSION="1.0.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-generator-ramp"

# Parse arguments
while [[ $# -gt 0 ]]; do
    case $1 in
        --version|-v)
            VERSION="$2"
            shift 2
            ;;
        --output|-o)
            OUTPUT_DIR="$2"
            shift 2
            ;;
        --help|-h)
            echo "Usage: $0 [OPTIONS]"
            echo ""
            echo "Options:"
            echo "  -v, --version VERSION   Set package version (default: 1.0.0)"
            echo "  -o, --output PATH       Output directory (default: script directory)"
            echo "  -h, --help              Show this help message"
            echo ""
            echo "Example:"
            echo "  $0 --version 1.2.0 --output ./dist/"
            exit 0
            ;;
        *)
            echo "Unknown option: $1"
            echo "Use --help for usage information"
            exit 1
            ;;
    esac
done

# Build timestamp
BUILD_DATE=$(date -u +"%Y-%m-%d %H:%M:%S UTC")
BUILD_TIMESTAMP=$(date +%Y%m%d%H%M%S)

# Temporary build directory
BUILD_DIR=$(mktemp -d)
PACKAGE_DIR="$BUILD_DIR/$PACKAGE_NAME"

echo "=================================================="
echo "Building $PACKAGE_NAME package"
echo "=================================================="
echo "Version:    $VERSION"
echo "Build date: $BUILD_DATE"
echo "Source:     $SCRIPT_DIR"
echo "Output:     $OUTPUT_DIR"
echo ""

# Create package directory structure
echo "1. Creating package structure..."
mkdir -p "$PACKAGE_DIR"
mkdir -p "$PACKAGE_DIR/service/log"
mkdir -p "$PACKAGE_DIR/service-webui/log"
mkdir -p "$PACKAGE_DIR/qml"

# Copy main Python files
echo "2. Copying application files..."
cp "$SCRIPT_DIR/dbus-generator-ramp.py" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/config.py" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/ramp_controller.py" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/overload_detector.py" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/web_ui.py" "$PACKAGE_DIR/"

# Copy service files
echo "3. Copying service files..."
cp "$SCRIPT_DIR/service/run" "$PACKAGE_DIR/service/"
cp "$SCRIPT_DIR/service/log/run" "$PACKAGE_DIR/service/log/"
cp "$SCRIPT_DIR/service-webui/run" "$PACKAGE_DIR/service-webui/"
cp "$SCRIPT_DIR/service-webui/log/run" "$PACKAGE_DIR/service-webui/log/"

# Copy QML files
echo "4. Copying GUI files..."
cp "$SCRIPT_DIR/qml/PageSettingsGeneratorRamp.qml" "$PACKAGE_DIR/qml/"

# Copy installation scripts
echo "5. Copying installation scripts..."
cp "$SCRIPT_DIR/install.sh" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/uninstall.sh" "$PACKAGE_DIR/"
cp "$SCRIPT_DIR/install_gui.sh" "$PACKAGE_DIR/"

# Copy documentation
echo "6. Copying documentation..."
cp "$SCRIPT_DIR/README.md" "$PACKAGE_DIR/"

# Create version file with build info
echo "7. Creating version info..."
cat > "$PACKAGE_DIR/VERSION" << EOF
Package: $PACKAGE_NAME
Version: $VERSION
Build Date: $BUILD_DATE
Build Timestamp: $BUILD_TIMESTAMP

Installation:
  1. Copy this package to CerboGX: scp $PACKAGE_NAME-$VERSION.tar.gz root@<cerbo-ip>:/data/
  2. SSH to CerboGX: ssh root@<cerbo-ip>
  3. Extract: cd /data && tar -xzf $PACKAGE_NAME-$VERSION.tar.gz
  4. Install: cd $PACKAGE_NAME && ./install.sh [--webui]

For more information, see README.md
EOF

# Set executable permissions
echo "8. Setting permissions..."
chmod +x "$PACKAGE_DIR/dbus-generator-ramp.py"
chmod +x "$PACKAGE_DIR/web_ui.py"
chmod +x "$PACKAGE_DIR/install.sh"
chmod +x "$PACKAGE_DIR/uninstall.sh"
chmod +x "$PACKAGE_DIR/install_gui.sh"
chmod +x "$PACKAGE_DIR/service/run"
chmod +x "$PACKAGE_DIR/service/log/run"
chmod +x "$PACKAGE_DIR/service-webui/run"
chmod +x "$PACKAGE_DIR/service-webui/log/run"

# Create output directory if needed and resolve it to an absolute path
# (we cd into the temp build dir before running tar, so a relative
# --output path would otherwise point at the wrong location)
mkdir -p "$OUTPUT_DIR"
OUTPUT_DIR="$(cd "$OUTPUT_DIR" && pwd)"

# Create the tar.gz package
TARBALL_NAME="$PACKAGE_NAME-$VERSION.tar.gz"
TARBALL_PATH="$OUTPUT_DIR/$TARBALL_NAME"

echo "9. Creating package archive..."
cd "$BUILD_DIR"
tar -czf "$TARBALL_PATH" "$PACKAGE_NAME"

# Calculate checksum
CHECKSUM=$(sha256sum "$TARBALL_PATH" | cut -d' ' -f1)

# Create checksum file (two spaces: the format `sha256sum -c` expects)
echo "$CHECKSUM  $TARBALL_NAME" > "$OUTPUT_DIR/$TARBALL_NAME.sha256"

# Clean up
echo "10. Cleaning up..."
rm -rf "$BUILD_DIR"

# Get file size
FILE_SIZE=$(du -h "$TARBALL_PATH" | cut -f1)

echo ""
echo "=================================================="
echo "Build complete!"
echo "=================================================="
echo ""
echo "Package: $TARBALL_PATH"
echo "Size:    $FILE_SIZE"
echo "SHA256:  $CHECKSUM"
echo ""
echo "Installation on CerboGX:"
echo "  scp $TARBALL_PATH root@<cerbo-ip>:/data/"
echo "  ssh root@<cerbo-ip>"
echo "  cd /data"
echo "  tar -xzf $TARBALL_NAME"
echo "  cd $PACKAGE_NAME"
echo "  ./install.sh          # Main service only"
echo "  ./install.sh --webui  # With web UI"
echo ""
261  dbus-generator-ramp/config.py  Normal file
@@ -0,0 +1,261 @@
"""
Configuration for Generator Current Ramp Controller

All tunable parameters in one place for easy adjustment.
"""

# =============================================================================
# D-BUS SERVICE CONFIGURATION
# =============================================================================

DBUS_CONFIG = {
    # VE.Bus inverter/charger service
    'vebus_service': 'com.victronenergy.vebus.ttyS4',

    # Generator start/stop service
    'generator_service': 'com.victronenergy.generator.startstop0',

    # Which AC input the generator is connected to (1 or 2)
    'generator_ac_input': 1,
}

# =============================================================================
# GENERATOR STATE VALUES
# =============================================================================

GENERATOR_STATE = {
    'STOPPED': 0,
    'RUNNING': 1,
    'WARMUP': 2,
    'COOLDOWN': 3,
    'ERROR': 10,
}

# =============================================================================
# CURRENT LIMIT CONFIGURATION
# =============================================================================

RAMP_CONFIG = {
    # Starting current limit after generator warm-up (Amps)
    'initial_current': 40.0,

    # Target current limit to ramp up to (Amps)
    # This is the max achievable at high output loads (2500-5000W)
    # Actual limit is constrained by PowerCorrelationModel based on output power
    'target_current': 54.0,

    # Absolute minimum current limit - safety floor (Amps)
    'minimum_current': 30.0,

    # Maximum current limit - will be read from inverter but this is a sanity check
    'maximum_current': 100.0,

    # Initial ramp rate: Amps per minute
    # 10A over 30 minutes = 0.333 A/min
    'initial_ramp_rate': 0.333,

    # Recovery ramp rate after overload: Amps per minute (faster)
    'recovery_ramp_rate': 0.5,

    # How long to wait at initial_current after overload before ramping again (seconds)
    'cooldown_duration': 300,  # 5 minutes

    # Safety margin below last stable point for recovery target (Amps)
    'recovery_margin': 2.0,

    # Additional margin per overload event (Amps)
    'margin_per_overload': 2.0,

    # Time stable at target before clearing overload history (seconds)
    # Allows system to attempt full power again after extended stable operation
    'history_clear_time': 1800,  # 30 minutes

    # How often to update the current limit (seconds)
    # More frequent = smoother ramp, but more D-Bus writes
    'ramp_update_interval': 30,

    # --- Fast Recovery Settings ---
    # When recovering, fast-ramp to (overload_current - fast_recovery_margin),
    # then slow-ramp from there to the target
    'fast_recovery_margin': 4.0,  # Amps below overload point

    # If overload occurs again within this time, use a larger margin
    'rapid_overload_threshold': 120,  # 2 minutes

    # Additional margin when overloads happen in rapid succession
    'rapid_overload_extra_margin': 2.0,  # Added to fast_recovery_margin

    # Fast ramp rate (Amps per minute) - much faster than normal
    'fast_ramp_rate': 5.0,  # 5A per minute = 12 seconds per amp

    # --- Return to prior stable current after overload ---
    # If we were stable at a current limit for this long, assume the overload
    # was a step load (e.g. a heater switching on) and make that current the
    # recovery target.
    'return_to_stable_after_overload': True,
    'return_to_stable_min_duration': 1800,  # Seconds (30 minutes) stable before returning to that current
}

# =============================================================================
# OUTPUT POWER CORRELATION LEARNING
# =============================================================================

# Output power determines how much generator power smoothing is applied by the
# inverter/charger. Higher output loads pass through more directly, allowing
# higher input current limits. Lower output loads stress the inverter more
# (all input goes to battery charging), requiring lower input limits.
#
# Key insight: Quick INPUT power fluctuations indicate overload/instability
# regardless of output load level - these should always trigger detection.

LEARNING_CONFIG = {
    # Enable/disable learning system
    'enabled': True,

    # How many data points to keep for correlation analysis
    'max_data_points': 100,

    # Minimum stable time before recording a data point (seconds)
    'min_stable_time': 60,

    # Output power zones for adaptive current limits
    # Based on observed relationship between output load and achievable input current:
    #   - Low output (0-1500W): Inverter does most smoothing, ~45A achievable
    #   - Medium output (1500-2500W): Transitional zone, ~49A achievable
    #   - High output (2500-5000W): Direct passthrough, ~54A achievable
    'power_zones': {
        # name: (min_output_watts, max_output_watts, input_current_offset)
        # The offset is added to the base learned limit (45A)
        'LOW': (0, 1500, 0.0),            # Low loads: 45A limit
        'MEDIUM': (1500, 2500, 4.0),      # Transitional: 49A limit
        'HIGH': (2500, 5000, 9.0),        # High loads: 54A limit
        'VERY_HIGH': (5000, 10000, 9.0),  # Very high: cap at 54A (same as HIGH)
    },

    # Initial model parameters (linear: max_current = base + slope * output_power)
    # Base current is achievable at LOW output (0-1500W)
    # Slope is 0 because we use discrete zones for step changes
    'initial_base_current': 45.0,
    'initial_slope': 0.0,  # Use zones for discrete steps instead of continuous slope

    # Learning rate for updating model (0-1, higher = faster adaptation)
    'learning_rate': 0.1,

    # Minimum confidence (data points) before using learned model
    'min_confidence': 5,

    # --- Output Power Change Detection ---
    # If output power increases by this much, re-evaluate recovery target
    # Higher output = more power passes through to loads = less inverter stress
    'output_power_increase_threshold': 2000,  # Watts

    # Minimum time between target re-evaluations (prevents rapid changes)
    'min_reevaluation_interval': 60,  # Seconds
}

# =============================================================================
# OVERLOAD DETECTION CONFIGURATION
# =============================================================================

# CRITICAL: Overload/instability is detected via INPUT power fluctuations.
# Quick fluctuations on INPUT (not output) indicate generator stress.
# Output power only affects the achievable input current limit, NOT detection.
# Any quick INPUT fluctuations at ANY output load level should trigger detection.

OVERLOAD_CONFIG = {
    # Sampling interval for power readings (milliseconds)
    'sample_interval_ms': 500,

    # --- Method 1: Rate of Change Reversal Detection ---
    # Detects rapid oscillations in INPUT power (sign changes in derivative)
    # Minimum power change to be considered significant (Watts)
    'derivative_threshold': 150,

    # Number of significant reversals in window to trigger
    'reversal_threshold': 5,

    # Window size for reversal counting (samples)
    'reversal_window_size': 20,  # 10 seconds at 500ms

    # --- Method 2: Detrended Standard Deviation ---
    # Detects erratic INPUT power fluctuations after removing trend
    # Standard deviation threshold after removing linear trend (Watts)
    # Note: 400W threshold avoids false positives during normal ramp settling
    'std_dev_threshold': 400,

    # Window size for std dev calculation (samples)
    'std_dev_window_size': 40,  # 20 seconds at 500ms

    # --- Confirmation (prevents false positives) ---
    # How many samples to consider for confirmation
    'confirmation_window': 10,

    # How many positive detections needed to confirm overload
    # Note: 7/10 threshold requires more persistent detection to reduce false positives
    'confirmation_threshold': 7,

    # --- Lockout after detection ---
    # Prevent re-detection for this many seconds after triggering
    'lockout_duration': 10,

    # --- Ramp Start Grace Period ---
    # After ramp starts, suppress detection for this many seconds
    # Allows system to settle without triggering false overload detection
    'ramp_start_grace_period': 30,

    # --- Minimum Output Power Requirement ---
    # IMPORTANT: Set to 0 to detect fluctuations at ANY output load level
    # Quick INPUT fluctuations indicate overload/instability regardless of output
    # Output power only affects achievable limit, not detection sensitivity
    'min_output_power_for_overload': 0,  # CRITICAL: Must be 0 - detect at any load

    # --- Trend Direction Check ---
    # If power is trending DOWN faster than this rate, ignore as load drop (not overload)
    # A true overload has power oscillating at/near ceiling, not dropping
    # Watts per second - negative values mean power is dropping
    'trend_drop_threshold': -100,  # -100W/s = ignore if dropping faster than this

    # --- Input Power Drop Check ---
    # If INPUT power is significantly below recent maximum, it's not an overload
    # True overloads oscillate AT/NEAR a ceiling - not dropping away from it
    # This catches step-change load drops that trend detection might miss
    # Note: We check INPUT power (generator load), NOT output power
    # Output power is unreliable because the charger can absorb freed capacity
    # (e.g., a load turns off but the charger takes over - input stays high)
    'max_power_drop_for_overload': 1000,  # Watts - ignore if INPUT dropped more than this

    # --- Smoothing for trend detection (optional) ---
    # If true, apply EMA smoothing before derivative calculation
    'use_smoothing': False,
    'smoothing_alpha': 0.3,
}

# =============================================================================
# LOGGING CONFIGURATION
# =============================================================================

LOGGING_CONFIG = {
    # Log level: DEBUG, INFO, WARNING, ERROR
    'level': 'INFO',

    # Log to console (stdout)
    'console': True,

    # Log file path (set to None to disable file logging)
    'file_path': None,  # Venus uses multilog, so we log to stdout

    # Include timestamps in console output (multilog adds its own)
    'include_timestamp': False,
}

# =============================================================================
# MAIN LOOP TIMING
# =============================================================================

TIMING_CONFIG = {
    # Main loop interval (milliseconds)
    # This controls how often we check state and update
    'main_loop_interval_ms': 500,

    # Timeout for D-Bus operations (seconds)
    'dbus_timeout': 5,
}
967  dbus-generator-ramp/dbus-generator-ramp.py  Executable file
@@ -0,0 +1,967 @@
#!/usr/bin/env python3
"""
Venus OS Generator Current Ramp Controller

Monitors generator operation and dynamically adjusts the inverter/charger
input current limit to prevent generator overload.

This version publishes its own D-Bus service so status is visible via MQTT:
    N/<vrm-id>/generatorramp/0/...

Features:
- Preemptively sets the 40A limit when the generator enters warm-up
- Ramps from the initial 40A limit toward the configured target (54A by default) after AC connects
- Detects generator overload via power fluctuation analysis
- Rolls back to 40A on overload, then conservatively ramps back up
- Publishes status to D-Bus/MQTT for monitoring
- Settings adjustable via D-Bus/MQTT

Author: Claude (Anthropic)
License: MIT
"""

import sys
import os
import logging
import signal
from time import time, sleep

# Add velib_python to path (Venus OS standard location)
sys.path.insert(1, os.path.join(os.path.dirname(__file__), 'ext', 'velib_python'))
sys.path.insert(1, '/opt/victronenergy/velib_python')

try:
    from gi.repository import GLib
except ImportError:
    print("ERROR: GLib not available. This script must run on Venus OS.")
    sys.exit(1)

try:
    import dbus
    from dbus.mainloop.glib import DBusGMainLoop
    from vedbus import VeDbusService
    from settingsdevice import SettingsDevice
except ImportError as e:
    print(f"ERROR: Required module not available: {e}")
    print("This script must run on Venus OS.")
    sys.exit(1)

from config import (
    DBUS_CONFIG, GENERATOR_STATE, RAMP_CONFIG,
    OVERLOAD_CONFIG, LOGGING_CONFIG, TIMING_CONFIG, LEARNING_CONFIG
)
from overload_detector import OverloadDetector
from ramp_controller import RampController


# Version
VERSION = '1.1.1'

# D-Bus service name for our addon
SERVICE_NAME = 'com.victronenergy.generatorramp'


class GeneratorRampController:
    """
    Main controller that coordinates:
    - D-Bus monitoring of generator and inverter state
    - Overload detection from power readings
    - Current limit ramping
    - Publishing status to D-Bus (visible via MQTT)
    """

    # Controller states
    STATE_IDLE = 0
    STATE_WARMUP = 1
    STATE_RAMPING = 2
    STATE_COOLDOWN = 3
    STATE_RECOVERY = 4
    STATE_STABLE = 5

    STATE_NAMES = {
        0: 'Idle',
        1: 'Warm-up',
        2: 'Ramping',
        3: 'Cooldown',
        4: 'Recovery',
        5: 'Stable',
    }

    def __init__(self):
        self._setup_logging()
        self.logger = logging.getLogger('GenRampCtrl')
        self.logger.info(f"Initializing Generator Ramp Controller v{VERSION}")

        # Components
        self.overload_detector = OverloadDetector(OVERLOAD_CONFIG)
        self.ramp_controller = RampController(RAMP_CONFIG)

        # State
        self.state = self.STATE_IDLE
        self.state_enter_time = time()

        # Cached values from D-Bus
        self.generator_state = GENERATOR_STATE['STOPPED']
        self.ac_connected = False
        self.current_l1_power = 0
        self.current_l2_power = 0
        self.current_limit_setting = 0

        # Output power tracking (loads on inverter output)
        self.output_l1_power = 0
        self.output_l2_power = 0

        # Enabled flag
        self.enabled = True

        # D-Bus connection
        self.bus = dbus.SystemBus()

        # Create our D-Bus service for publishing status
        self._create_dbus_service()

        # Set up settings (stored in Venus localsettings)
        self._setup_settings()

        # Connect to VE.Bus and Generator services
        self._init_dbus_monitors()

        # Start main loop timer
        interval_ms = TIMING_CONFIG['main_loop_interval_ms']
        GLib.timeout_add(interval_ms, self._main_loop)

        self.logger.info(f"Initialized. Main loop interval: {interval_ms}ms")

    def _setup_logging(self):
        """Configure logging based on config"""
        level = getattr(logging, LOGGING_CONFIG['level'], logging.INFO)

        if LOGGING_CONFIG['include_timestamp']:
            fmt = '%(asctime)s %(levelname)s %(name)s: %(message)s'
        else:
            fmt = '%(levelname)s %(name)s: %(message)s'

        logging.basicConfig(
            level=level,
            format=fmt,
            stream=sys.stdout
        )

    def _create_dbus_service(self):
        """Create our own D-Bus service for publishing status"""
        self.logger.info(f"Creating D-Bus service: {SERVICE_NAME}")

        # Retry logic in case a previous instance hasn't released the bus name yet
        max_retries = 5
        retry_delay = 1.0  # seconds

        for attempt in range(max_retries):
            try:
                self.dbus_service = VeDbusService(SERVICE_NAME, self.bus)
                break  # Success
            except dbus.exceptions.NameExistsException:
                if attempt < max_retries - 1:
                    self.logger.warning(
                        f"D-Bus name exists, retrying in {retry_delay}s "
                        f"(attempt {attempt + 1}/{max_retries})"
                    )
                    sleep(retry_delay)
                    retry_delay *= 2  # Exponential backoff
                else:
                    self.logger.error("Failed to acquire D-Bus name after retries")
                    raise

        # Add management paths (required for Venus)
        self.dbus_service.add_path('/Mgmt/ProcessName', 'dbus-generator-ramp')
        self.dbus_service.add_path('/Mgmt/ProcessVersion', VERSION)
        self.dbus_service.add_path('/Mgmt/Connection', 'local')

        # Add device info
        self.dbus_service.add_path('/DeviceInstance', 0)
        self.dbus_service.add_path('/ProductId', 0xFFFF)
        self.dbus_service.add_path('/ProductName', 'Generator Ramp Controller')
        self.dbus_service.add_path('/FirmwareVersion', VERSION)
        self.dbus_service.add_path('/Connected', 1)

        # Status paths (read-only) - these will be visible in MQTT
        self.dbus_service.add_path('/State', self.STATE_IDLE,
                                   gettextcallback=lambda p, v: self.STATE_NAMES.get(v, 'Unknown'))
        self.dbus_service.add_path('/CurrentLimit', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/TargetLimit', RAMP_CONFIG['target_current'],
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/RecoveryTarget', RAMP_CONFIG['target_current'],
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/OverloadCount', 0)
        self.dbus_service.add_path('/LastStableCurrent', RAMP_CONFIG['initial_current'],
                                   gettextcallback=lambda p, v: f"{v:.1f}A")

        # Input power monitoring
        self.dbus_service.add_path('/Power/L1', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/Power/L2', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/Power/Total', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")

        # Output power monitoring (loads on inverter output)
        self.dbus_service.add_path('/OutputPower/L1', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/OutputPower/L2', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/OutputPower/Total', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")

        # Overload detection diagnostics
        self.dbus_service.add_path('/Detection/Reversals', 0)
        self.dbus_service.add_path('/Detection/StdDev', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.1f}W")
        self.dbus_service.add_path('/Detection/IsOverload', 0)
        self.dbus_service.add_path('/Detection/OutputPowerOk', 1)  # 1=sufficient load for detection
        self.dbus_service.add_path('/Detection/Trend', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.1f}W/s")
        self.dbus_service.add_path('/Detection/TrendOk', 1)  # 1=not dropping fast
        self.dbus_service.add_path('/Detection/PowerDrop', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/Detection/PowerDropOk', 1)  # 1=input not dropped from ceiling

        # Generator and AC status (mirrors)
        self.dbus_service.add_path('/Generator/State', 0,
                                   gettextcallback=self._generator_state_text)
        self.dbus_service.add_path('/AcInput/Connected', 0)

        # Ramp progress
        self.dbus_service.add_path('/Ramp/Progress', 0,
                                   gettextcallback=lambda p, v: f"{v}%")
        self.dbus_service.add_path('/Ramp/TimeRemaining', 0,
                                   gettextcallback=lambda p, v: f"{v//60}m {v%60}s" if v > 0 else "0s")

        # Fast recovery status
        self.dbus_service.add_path('/Recovery/InFastRamp', 0)
        self.dbus_service.add_path('/Recovery/FastRampTarget', 0.0,
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/Recovery/RapidOverloadCount', 0)

        # Learning model status
        self.dbus_service.add_path('/Learning/Confidence', 0)
        self.dbus_service.add_path('/Learning/ConfidenceLevel', 'LOW')
        self.dbus_service.add_path('/Learning/BaseCurrent', LEARNING_CONFIG['initial_base_current'],
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/Learning/SuggestedLimit', RAMP_CONFIG['target_current'],
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/Learning/DataPoints', 0)
        self.dbus_service.add_path('/Learning/OverloadPoints', 0)

        # Writable settings (can be changed via D-Bus/MQTT)
        self.dbus_service.add_path('/Settings/InitialCurrent',
                                   RAMP_CONFIG['initial_current'],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/Settings/TargetCurrent',
                                   RAMP_CONFIG['target_current'],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.1f}A")
        self.dbus_service.add_path('/Settings/RampDuration',
                                   30,  # minutes
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v} min")
        self.dbus_service.add_path('/Settings/CooldownDuration',
                                   int(RAMP_CONFIG['cooldown_duration'] // 60),
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v} min")
        self.dbus_service.add_path('/Settings/Enabled',
                                   1,
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed)

        # Power zone settings (output-based input current limits)
        # These control how the input current limit varies with output load
        self.dbus_service.add_path('/Settings/LowOutputLimit',
                                   LEARNING_CONFIG['initial_base_current'],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.0f}A")
        self.dbus_service.add_path('/Settings/MediumOutputLimit',
                                   LEARNING_CONFIG['initial_base_current'] + LEARNING_CONFIG['power_zones']['MEDIUM'][2],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.0f}A")
        self.dbus_service.add_path('/Settings/HighOutputLimit',
                                   LEARNING_CONFIG['initial_base_current'] + LEARNING_CONFIG['power_zones']['HIGH'][2],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.0f}A")
        self.dbus_service.add_path('/Settings/LowOutputThreshold',
                                   LEARNING_CONFIG['power_zones']['LOW'][1],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")
        self.dbus_service.add_path('/Settings/HighOutputThreshold',
                                   LEARNING_CONFIG['power_zones']['MEDIUM'][1],
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v:.0f}W")

        # Return to prior stable current after overload (if stable > N minutes)
        self.dbus_service.add_path('/Settings/ReturnToStableAfterOverload',
                                   1 if RAMP_CONFIG.get('return_to_stable_after_overload', True) else 0,
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed)
        self.dbus_service.add_path('/Settings/ReturnToStableMinMinutes',
                                   int(RAMP_CONFIG.get('return_to_stable_min_duration', 1800) // 60),
                                   writeable=True,
                                   onchangecallback=self._on_setting_changed,
                                   gettextcallback=lambda p, v: f"{v} min")

        self.logger.info("D-Bus service created")

def _generator_state_text(self, path, value):
|
||||
"""Get text for generator state"""
|
||||
states = {0: 'Stopped', 1: 'Running', 2: 'Warm-up', 3: 'Cool-down', 10: 'Error'}
|
||||
return states.get(value, f'Unknown ({value})')
|
||||
|
||||
def _on_setting_changed(self, path, value):
|
||||
"""Handle setting changes from D-Bus/MQTT"""
|
||||
self.logger.info(f"Setting changed: {path} = {value}")
|
||||
|
||||
if path == '/Settings/InitialCurrent':
|
||||
RAMP_CONFIG['initial_current'] = float(value)
|
||||
self._save_setting('InitialCurrent', float(value))
|
||||
|
||||
elif path == '/Settings/TargetCurrent':
|
||||
RAMP_CONFIG['target_current'] = float(value)
|
||||
self.dbus_service['/TargetLimit'] = float(value)
|
||||
self._save_setting('TargetCurrent', float(value))
|
||||
|
||||
elif path == '/Settings/RampDuration':
|
||||
duration_min = int(value)
|
||||
delta = RAMP_CONFIG['target_current'] - RAMP_CONFIG['initial_current']
|
||||
RAMP_CONFIG['initial_ramp_rate'] = delta / duration_min if duration_min > 0 else 0.333
|
||||
self._save_setting('RampDuration', duration_min)
|
||||
|
||||
elif path == '/Settings/CooldownDuration':
|
||||
RAMP_CONFIG['cooldown_duration'] = int(value) * 60
|
||||
self._save_setting('CooldownDuration', int(value))
|
||||
|
||||
elif path == '/Settings/Enabled':
|
||||
self.enabled = bool(value)
|
||||
self._save_setting('Enabled', int(value))
|
||||
if not self.enabled:
|
||||
self.logger.info("Controller disabled")
|
||||
self._transition_to(self.STATE_IDLE)
|
||||
|
||||
# Power zone settings
|
||||
elif path == '/Settings/LowOutputLimit':
|
||||
self._update_power_zones(low_limit=float(value))
|
||||
self._save_setting('LowOutputLimit', float(value))
|
||||
|
||||
elif path == '/Settings/MediumOutputLimit':
|
||||
self._update_power_zones(medium_limit=float(value))
|
||||
self._save_setting('MediumOutputLimit', float(value))
|
||||
|
||||
elif path == '/Settings/HighOutputLimit':
|
||||
self._update_power_zones(high_limit=float(value))
|
||||
self._save_setting('HighOutputLimit', float(value))
|
||||
|
||||
elif path == '/Settings/LowOutputThreshold':
|
||||
self._update_power_zones(low_threshold=int(value))
|
||||
self._save_setting('LowOutputThreshold', int(value))
|
||||
|
||||
elif path == '/Settings/HighOutputThreshold':
|
||||
self._update_power_zones(high_threshold=int(value))
|
||||
self._save_setting('HighOutputThreshold', int(value))
|
||||
|
||||
elif path == '/Settings/ReturnToStableAfterOverload':
|
||||
RAMP_CONFIG['return_to_stable_after_overload'] = bool(value)
|
||||
self._save_setting('ReturnToStableAfterOverload', int(value))
|
||||
|
||||
elif path == '/Settings/ReturnToStableMinMinutes':
|
||||
RAMP_CONFIG['return_to_stable_min_duration'] = int(value) * 60
|
||||
self._save_setting('ReturnToStableMinMinutes', int(value))
|
||||
|
||||
return True
|
||||
|
||||
def _save_setting(self, name, value):
|
||||
"""Save a setting to localsettings"""
|
||||
if self.settings:
|
||||
try:
|
||||
self.settings[name] = value
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Failed to save setting {name}: {e}")
|
||||
|
||||
def _update_power_zones(self, low_limit=None, medium_limit=None, high_limit=None,
|
||||
low_threshold=None, high_threshold=None):
|
||||
"""
|
||||
Update power zone configuration based on UI settings.
|
||||
|
||||
The UI exposes absolute current limits for each zone, but internally
|
||||
we store base_current and zone offsets. This method converts between them.
|
||||
"""
|
||||
# Get current values
|
||||
base = LEARNING_CONFIG['initial_base_current']
|
||||
zones = LEARNING_CONFIG['power_zones']
|
||||
|
||||
# Current absolute limits
|
||||
current_low = base + zones['LOW'][2]
|
||||
current_medium = base + zones['MEDIUM'][2]
|
||||
current_high = base + zones['HIGH'][2]
|
||||
|
||||
# Current thresholds
|
||||
current_low_thresh = zones['LOW'][1]
|
||||
current_high_thresh = zones['MEDIUM'][1]
|
||||
|
||||
# Apply changes
|
||||
if low_limit is not None:
|
||||
# Low limit becomes the new base
|
||||
new_base = float(low_limit)
|
||||
# Adjust other offsets to maintain their absolute values
|
||||
medium_offset = (medium_limit if medium_limit else current_medium) - new_base
|
||||
high_offset = (high_limit if high_limit else current_high) - new_base
|
||||
LEARNING_CONFIG['initial_base_current'] = new_base
|
||||
zones['LOW'] = (zones['LOW'][0], zones['LOW'][1], 0.0)
|
||||
zones['MEDIUM'] = (zones['MEDIUM'][0], zones['MEDIUM'][1], medium_offset)
|
||||
zones['HIGH'] = (zones['HIGH'][0], zones['HIGH'][1], high_offset)
|
||||
zones['VERY_HIGH'] = (zones['VERY_HIGH'][0], zones['VERY_HIGH'][1], high_offset)
|
||||
# Update the ramp controller's model
|
||||
self.ramp_controller.power_model.base_current = new_base
|
||||
self.logger.info(f"Power zones updated: base={new_base}A")
|
||||
|
||||
if medium_limit is not None and low_limit is None:
|
||||
medium_offset = float(medium_limit) - base
|
||||
zones['MEDIUM'] = (zones['MEDIUM'][0], zones['MEDIUM'][1], medium_offset)
|
||||
self.logger.info(f"Medium zone limit: {medium_limit}A (offset={medium_offset}A)")
|
||||
|
||||
if high_limit is not None and low_limit is None:
|
||||
high_offset = float(high_limit) - base
|
||||
zones['HIGH'] = (zones['HIGH'][0], zones['HIGH'][1], high_offset)
|
||||
zones['VERY_HIGH'] = (zones['VERY_HIGH'][0], zones['VERY_HIGH'][1], high_offset)
|
||||
self.logger.info(f"High zone limit: {high_limit}A (offset={high_offset}A)")
|
||||
|
||||
if low_threshold is not None:
|
||||
# Update LOW zone upper bound and MEDIUM zone lower bound
|
||||
zones['LOW'] = (0, int(low_threshold), zones['LOW'][2])
|
||||
zones['MEDIUM'] = (int(low_threshold), zones['MEDIUM'][1], zones['MEDIUM'][2])
|
||||
self.logger.info(f"Low/Medium threshold: {low_threshold}W")
|
||||
|
||||
if high_threshold is not None:
|
||||
# Update MEDIUM zone upper bound and HIGH zone lower bound
|
||||
zones['MEDIUM'] = (zones['MEDIUM'][0], int(high_threshold), zones['MEDIUM'][2])
|
||||
zones['HIGH'] = (int(high_threshold), zones['HIGH'][1], zones['HIGH'][2])
|
||||
self.logger.info(f"Medium/High threshold: {high_threshold}W")
|
||||
|
||||
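The conversion in `_update_power_zones` between the absolute per-zone limits shown in the UI and the internally stored base current plus offsets can be sketched in isolation. This is an illustrative standalone sketch: the zone names and the `(low_W, high_W, offset_A)` tuple layout follow the config used above, but the numeric values are made up for the example.

```python
# Illustrative sketch of the absolute-limit <-> base+offset conversion
# performed by _update_power_zones. Zone tuples are (low_W, high_W, offset_A).
zones = {
    'LOW': (0, 1500, 0.0),
    'MEDIUM': (1500, 2500, 4.0),
    'HIGH': (2500, 99999, 9.0),
}
base = 45.0  # corresponds to initial_base_current

# Absolute limits as shown in the UI: base + offset
absolute = {name: base + z[2] for name, z in zones.items()}
assert absolute == {'LOW': 45.0, 'MEDIUM': 49.0, 'HIGH': 54.0}

# Changing the LOW limit re-bases everything: the new low limit becomes the
# base, and the other offsets are recomputed to keep their absolute values.
new_base = 47.0
zones = {name: (z[0], z[1], absolute[name] - new_base if name != 'LOW' else 0.0)
         for name, z in zones.items()}
base = new_base
assert base + zones['MEDIUM'][2] == 49.0  # MEDIUM absolute limit unchanged
assert base + zones['HIGH'][2] == 54.0    # HIGH absolute limit unchanged
```

The round-trip property is the important part: re-basing changes only the internal representation, never the limits the user sees.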
    def _setup_settings(self):
        """Set up persistent settings via Venus localsettings."""
        self.settings = None
        try:
            settings_path = '/Settings/GeneratorRamp'

            # Define settings with defaults: [path, default, min, max]
            settings_def = {
                'InitialCurrent': [settings_path + '/InitialCurrent', 40.0, 10.0, 100.0],
                'TargetCurrent': [settings_path + '/TargetCurrent', 54.0, 10.0, 100.0],
                'RampDuration': [settings_path + '/RampDuration', 30, 1, 120],
                'CooldownDuration': [settings_path + '/CooldownDuration', 5, 1, 30],
                'Enabled': [settings_path + '/Enabled', 1, 0, 1],
                # Return to the prior stable current after an overload if stable > N min
                'ReturnToStableAfterOverload': [settings_path + '/ReturnToStableAfterOverload', 1, 0, 1],
                'ReturnToStableMinMinutes': [settings_path + '/ReturnToStableMinMinutes', 30, 5, 120],
                # Power zone settings (output-based input current limits)
                'LowOutputLimit': [settings_path + '/LowOutputLimit', 45.0, 30.0, 100.0],
                'MediumOutputLimit': [settings_path + '/MediumOutputLimit', 49.0, 30.0, 100.0],
                'HighOutputLimit': [settings_path + '/HighOutputLimit', 54.0, 30.0, 100.0],
                'LowOutputThreshold': [settings_path + '/LowOutputThreshold', 1500, 0, 10000],
                'HighOutputThreshold': [settings_path + '/HighOutputThreshold', 2500, 0, 10000],
            }

            self.settings = SettingsDevice(
                self.bus,
                settings_def,
                self._on_persistent_setting_changed
            )

            # Load saved settings
            if self.settings:
                self._load_settings()

            self.logger.info("Persistent settings initialized")

        except Exception as e:
            self.logger.warning(f"Could not initialize persistent settings: {e}")
            self.logger.warning("Settings will not persist across restarts")

    def _load_settings(self):
        """Load settings from Venus localsettings."""
        if not self.settings:
            return

        try:
            RAMP_CONFIG['initial_current'] = float(self.settings['InitialCurrent'])
            RAMP_CONFIG['target_current'] = float(self.settings['TargetCurrent'])

            duration_min = int(self.settings['RampDuration'])
            delta = RAMP_CONFIG['target_current'] - RAMP_CONFIG['initial_current']
            RAMP_CONFIG['initial_ramp_rate'] = delta / duration_min if duration_min > 0 else 0.333

            RAMP_CONFIG['cooldown_duration'] = int(self.settings['CooldownDuration']) * 60

            self.enabled = bool(self.settings['Enabled'])

            # Update D-Bus paths
            self.dbus_service['/Settings/InitialCurrent'] = RAMP_CONFIG['initial_current']
            self.dbus_service['/Settings/TargetCurrent'] = RAMP_CONFIG['target_current']
            self.dbus_service['/Settings/RampDuration'] = duration_min
            self.dbus_service['/Settings/CooldownDuration'] = int(self.settings['CooldownDuration'])
            self.dbus_service['/Settings/Enabled'] = 1 if self.enabled else 0
            self.dbus_service['/TargetLimit'] = RAMP_CONFIG['target_current']

            # Return to stable after overload (optional keys for older installs)
            RAMP_CONFIG['return_to_stable_after_overload'] = bool(
                self.settings.get('ReturnToStableAfterOverload', 1))
            RAMP_CONFIG['return_to_stable_min_duration'] = int(
                self.settings.get('ReturnToStableMinMinutes', 30)) * 60
            self.dbus_service['/Settings/ReturnToStableAfterOverload'] = 1 if RAMP_CONFIG['return_to_stable_after_overload'] else 0
            self.dbus_service['/Settings/ReturnToStableMinMinutes'] = RAMP_CONFIG['return_to_stable_min_duration'] // 60

            # Load power zone settings
            low_limit = float(self.settings['LowOutputLimit'])
            medium_limit = float(self.settings['MediumOutputLimit'])
            high_limit = float(self.settings['HighOutputLimit'])
            low_threshold = int(self.settings['LowOutputThreshold'])
            high_threshold = int(self.settings['HighOutputThreshold'])

            # Update the power zone configuration
            self._update_power_zones(
                low_limit=low_limit,
                medium_limit=medium_limit,
                high_limit=high_limit,
                low_threshold=low_threshold,
                high_threshold=high_threshold
            )

            # Update D-Bus paths for power zones
            self.dbus_service['/Settings/LowOutputLimit'] = low_limit
            self.dbus_service['/Settings/MediumOutputLimit'] = medium_limit
            self.dbus_service['/Settings/HighOutputLimit'] = high_limit
            self.dbus_service['/Settings/LowOutputThreshold'] = low_threshold
            self.dbus_service['/Settings/HighOutputThreshold'] = high_threshold

            self.logger.info(
                f"Loaded settings: {RAMP_CONFIG['initial_current']}A -> "
                f"{RAMP_CONFIG['target_current']}A over {duration_min}min"
            )
            self.logger.info(
                f"Power zones: {low_limit}A (0-{low_threshold}W), "
                f"{medium_limit}A ({low_threshold}-{high_threshold}W), "
                f"{high_limit}A ({high_threshold}W+)"
            )

        except Exception as e:
            self.logger.warning(f"Error loading settings: {e}")

    def _on_persistent_setting_changed(self, setting, old_value, new_value):
        """Called when a persistent setting changes externally."""
        self.logger.info(f"Persistent setting changed: {setting} = {new_value}")
        self._load_settings()

    def _init_dbus_monitors(self):
        """Initialize D-Bus service connections."""
        try:
            self.vebus_service = DBUS_CONFIG['vebus_service']
            self.generator_service = DBUS_CONFIG['generator_service']

            self.logger.info(f"Monitoring VE.Bus: {self.vebus_service}")
            self.logger.info(f"Monitoring Generator: {self.generator_service}")

            # Read initial values
            self._read_generator_state()
            self._read_ac_state()
            self._read_current_limit()

        except dbus.exceptions.DBusException as e:
            self.logger.error(f"D-Bus initialization failed: {e}")
            raise

    def _get_dbus_value(self, service, path):
        """Read a value from a D-Bus service."""
        try:
            obj = self.bus.get_object(service, path, introspect=False)
            return obj.GetValue(dbus_interface='com.victronenergy.BusItem')
        except dbus.exceptions.DBusException as e:
            self.logger.debug(f"Failed to get {service}{path}: {e}")
            return None

    def _set_dbus_value(self, service, path, value):
        """Write a value to a D-Bus service."""
        try:
            obj = self.bus.get_object(service, path, introspect=False)
            # Wrap the value in an appropriate D-Bus type (variant)
            if isinstance(value, float):
                dbus_value = dbus.Double(value, variant_level=1)
            elif isinstance(value, int):
                dbus_value = dbus.Int32(value, variant_level=1)
            else:
                dbus_value = value
            obj.SetValue(dbus_value, dbus_interface='com.victronenergy.BusItem')
            self.logger.debug(f"Set {path} = {value}")
            return True
        except dbus.exceptions.DBusException as e:
            self.logger.error(f"Failed to set {path}: {e}")
            return False

    def _read_generator_state(self):
        """Read the generator state from D-Bus."""
        value = self._get_dbus_value(self.generator_service, '/State')
        self.generator_state = int(value) if value is not None else 0
        self.dbus_service['/Generator/State'] = self.generator_state

    def _read_ac_state(self):
        """Read the AC input connection state from D-Bus."""
        value = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/Connected')
        self.ac_connected = bool(value) if value is not None else False
        self.dbus_service['/AcInput/Connected'] = 1 if self.ac_connected else 0

    def _read_power(self):
        """Read AC input power from D-Bus."""
        value = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/L1/P')
        self.current_l1_power = float(value) if value is not None else 0

        value = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/L2/P')
        self.current_l2_power = float(value) if value is not None else 0

        self.dbus_service['/Power/L1'] = self.current_l1_power
        self.dbus_service['/Power/L2'] = self.current_l2_power
        self.dbus_service['/Power/Total'] = self.current_l1_power + self.current_l2_power

    def _read_current_limit(self):
        """Read the current input limit setting from D-Bus."""
        value = self._get_dbus_value(self.vebus_service, '/Ac/In/1/CurrentLimit')
        self.current_limit_setting = float(value) if value is not None else 0
        self.dbus_service['/CurrentLimit'] = self.current_limit_setting

    def _read_output_power(self):
        """Read AC output power (loads) from D-Bus."""
        value = self._get_dbus_value(self.vebus_service, '/Ac/Out/L1/P')
        self.output_l1_power = float(value) if value is not None else 0

        value = self._get_dbus_value(self.vebus_service, '/Ac/Out/L2/P')
        self.output_l2_power = float(value) if value is not None else 0

        total_output = self.output_l1_power + self.output_l2_power

        self.dbus_service['/OutputPower/L1'] = self.output_l1_power
        self.dbus_service['/OutputPower/L2'] = self.output_l2_power
        self.dbus_service['/OutputPower/Total'] = total_output

        # Update the ramp controller with the current output power
        self.ramp_controller.set_output_power(total_output)

    def _set_current_limit(self, limit: float) -> bool:
        """Set the input current limit."""
        limit = round(limit, 1)
        self.logger.info(f"Setting current limit to {limit}A")
        success = self._set_dbus_value(self.vebus_service, '/Ac/In/1/CurrentLimit', limit)
        if success:
            self.dbus_service['/CurrentLimit'] = limit
        return success

    def _transition_to(self, new_state: int):
        """Transition to a new controller state."""
        old_name = self.STATE_NAMES.get(self.state, 'Unknown')
        new_name = self.STATE_NAMES.get(new_state, 'Unknown')
        self.logger.info(f"State: {old_name} -> {new_name}")

        self.state = new_state
        self.state_enter_time = time()
        self.dbus_service['/State'] = new_state

        if new_state == self.STATE_IDLE:
            self.overload_detector.reset()
            self.ramp_controller.reset()
            self.dbus_service['/Ramp/Progress'] = 0
            self.dbus_service['/Ramp/TimeRemaining'] = 0

    def _update_ramp_progress(self):
        """Update the ramp progress indicators."""
        if not self.ramp_controller.is_ramping:
            # When stable at target, show 100%
            if self.state == self.STATE_STABLE:
                self.dbus_service['/Ramp/Progress'] = 100
                self.dbus_service['/Ramp/TimeRemaining'] = 0
            return

        current = self.ramp_controller.current_limit
        initial = RAMP_CONFIG['initial_current']
        target = self.ramp_controller.state.target_limit

        if target > initial:
            progress = int(100 * (current - initial) / (target - initial))
            progress = max(0, min(100, progress))
        else:
            progress = 100

        self.dbus_service['/Ramp/Progress'] = progress

        if self.ramp_controller.state.is_recovery:
            rate = RAMP_CONFIG['recovery_ramp_rate']
        else:
            rate = RAMP_CONFIG['initial_ramp_rate']

        remaining_amps = target - current
        if rate > 0:
            remaining_seconds = int(remaining_amps / rate * 60)
        else:
            remaining_seconds = 0

        self.dbus_service['/Ramp/TimeRemaining'] = max(0, remaining_seconds)
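The progress and time-remaining arithmetic in `_update_ramp_progress` reduces to two small formulas, sketched here as standalone functions with illustrative numbers (the rate is in A/min, matching the units of `initial_ramp_rate` above; the function names are hypothetical, not part of the controller):

```python
def ramp_progress(current, initial, target):
    # Percentage of the way from initial to target, clamped to [0, 100]
    if target <= initial:
        return 100
    return max(0, min(100, int(100 * (current - initial) / (target - initial))))

def time_remaining_s(current, target, rate_a_per_min):
    # Seconds left at the configured ramp rate (A/min)
    if rate_a_per_min <= 0:
        return 0
    return max(0, int((target - current) / rate_a_per_min * 60))

# A 40A -> 54A ramp, currently at 47A, ramping at 0.5 A/min:
assert ramp_progress(47.0, 40.0, 54.0) == 50
assert time_remaining_s(47.0, 54.0, 0.5) == 840  # 7A at 0.5 A/min = 14 min
```

The clamp to `[0, 100]` matters when the limit was externally bumped past the target, and the `rate <= 0` guard mirrors the defensive `rate > 0` check in the method.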
    def _main_loop(self) -> bool:
        """Main control loop - called periodically by GLib."""
        try:
            now = time()

            # Check if enabled
            if not self.enabled:
                return True

            # Read current states from D-Bus
            self._read_generator_state()
            self._read_ac_state()
            self._read_power()
            self._read_output_power()
            self._read_current_limit()

            # Check for generator stop
            if self.generator_state == GENERATOR_STATE['STOPPED']:
                if self.state != self.STATE_IDLE:
                    self.logger.info("Generator stopped")
                    self._transition_to(self.STATE_IDLE)
                return True

            # Run overload detection
            is_overload = False
            if self.ac_connected and self.state in [self.STATE_RAMPING, self.STATE_RECOVERY, self.STATE_STABLE]:
                total_output_power = self.output_l1_power + self.output_l2_power
                is_overload, diag = self.overload_detector.update(
                    self.current_l1_power,
                    self.current_l2_power,
                    now,
                    output_power=total_output_power
                )
                if 'reversals' in diag:
                    self.dbus_service['/Detection/Reversals'] = diag.get('reversals', 0)
                    self.dbus_service['/Detection/StdDev'] = diag.get('std_dev', 0)
                    self.dbus_service['/Detection/IsOverload'] = 1 if is_overload else 0
                    self.dbus_service['/Detection/OutputPowerOk'] = 1 if diag.get('output_power_ok', True) else 0
                    self.dbus_service['/Detection/Trend'] = diag.get('trend', 0.0)
                    self.dbus_service['/Detection/TrendOk'] = 1 if diag.get('trend_ok', True) else 0
                    self.dbus_service['/Detection/PowerDrop'] = diag.get('power_drop', 0.0)
                    self.dbus_service['/Detection/PowerDropOk'] = 1 if diag.get('power_drop_ok', True) else 0

            # State machine
            if self.state == self.STATE_IDLE:
                self._handle_idle(now)
            elif self.state == self.STATE_WARMUP:
                self._handle_warmup(now)
            elif self.state == self.STATE_RAMPING:
                self._handle_ramping(now, is_overload)
            elif self.state == self.STATE_COOLDOWN:
                self._handle_cooldown(now)
            elif self.state == self.STATE_RECOVERY:
                self._handle_recovery(now, is_overload)
            elif self.state == self.STATE_STABLE:
                self._handle_stable(now, is_overload)

            # Update progress
            self._update_ramp_progress()

            # Update status
            status = self.ramp_controller.get_status()
            self.dbus_service['/RecoveryTarget'] = status['recovery_target']
            self.dbus_service['/OverloadCount'] = status['overload_count']
            self.dbus_service['/LastStableCurrent'] = status['last_stable']

            # Update fast recovery status
            self.dbus_service['/Recovery/InFastRamp'] = 1 if status.get('in_fast_ramp') else 0
            self.dbus_service['/Recovery/FastRampTarget'] = status.get('fast_ramp_target') or 0.0
            self.dbus_service['/Recovery/RapidOverloadCount'] = status.get('rapid_overload_count', 0)

            # Update learning model status
            power_model = status.get('power_model', {})
            self.dbus_service['/Learning/Confidence'] = power_model.get('confidence', 0)
            self.dbus_service['/Learning/ConfidenceLevel'] = power_model.get('confidence_level', 'LOW')
            self.dbus_service['/Learning/BaseCurrent'] = power_model.get('base_current', 42.0)
            self.dbus_service['/Learning/SuggestedLimit'] = status.get('suggested_limit', 50.0)
            self.dbus_service['/Learning/DataPoints'] = power_model.get('data_points', 0)
            self.dbus_service['/Learning/OverloadPoints'] = power_model.get('overload_points', 0)

        except Exception as e:
            self.logger.error(f"Main loop error: {e}", exc_info=True)

        return True
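The if/elif chain that dispatches on `self.state` above can equivalently be written as a dispatch table, which keeps the loop body flat as states are added. A minimal standalone sketch of the pattern (the state names and handlers here are illustrative stand-ins, not the controller's own):

```python
# Minimal dispatch-table alternative to an if/elif state dispatch.
# States and handlers are illustrative stand-ins.
STATE_IDLE, STATE_RAMPING, STATE_STABLE = 0, 1, 2

log = []
handlers = {
    STATE_IDLE: lambda now: log.append(('idle', now)),
    STATE_RAMPING: lambda now: log.append(('ramping', now)),
    STATE_STABLE: lambda now: log.append(('stable', now)),
}

def run_state(state, now):
    # Unknown states fall through silently, like an if/elif with no else
    handler = handlers.get(state)
    if handler is not None:
        handler(now)

run_state(STATE_RAMPING, 100.0)
assert log == [('ramping', 100.0)]
```

The explicit if/elif used in `_main_loop` is equally valid; a table mainly helps once handlers take differing signatures or the state set grows.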
    def _handle_idle(self, now: float):
        """IDLE: wait for the generator to start warm-up."""
        if self.generator_state == GENERATOR_STATE['WARMUP']:
            self.logger.info("Generator warm-up detected")
            new_limit = self.ramp_controller.set_initial_limit(self.current_limit_setting)
            self._set_current_limit(new_limit)
            self._transition_to(self.STATE_WARMUP)

        elif self.generator_state == GENERATOR_STATE['RUNNING'] and self.ac_connected:
            self.logger.info("Generator already running with AC connected")
            new_limit = self.ramp_controller.set_initial_limit(self.current_limit_setting)
            self._set_current_limit(new_limit)
            self.ramp_controller.start_ramp(now)
            self.overload_detector.set_ramp_start(now)
            self._transition_to(self.STATE_RAMPING)

    def _handle_warmup(self, now: float):
        """WARMUP: generator warming up, AC not yet connected."""
        if self.generator_state == GENERATOR_STATE['RUNNING'] and self.ac_connected:
            self.logger.info("Warm-up complete, AC connected - starting ramp")
            self.ramp_controller.start_ramp(now)
            self.overload_detector.set_ramp_start(now)
            self._transition_to(self.STATE_RAMPING)
        elif self.generator_state not in [GENERATOR_STATE['WARMUP'], GENERATOR_STATE['RUNNING']]:
            self._transition_to(self.STATE_IDLE)

    def _handle_ramping(self, now: float, is_overload: bool):
        """RAMPING: increasing the current limit."""
        if is_overload:
            self._handle_overload_event(now)
            return
        new_limit = self.ramp_controller.update(now)
        if new_limit is not None:
            self._set_current_limit(new_limit)
        if not self.ramp_controller.is_ramping:
            self.logger.info("Ramp complete, entering stable state")
            self._transition_to(self.STATE_STABLE)

    def _handle_cooldown(self, now: float):
        """COOLDOWN: waiting at the initial current after an overload."""
        elapsed = now - self.state_enter_time
        remaining = RAMP_CONFIG['cooldown_duration'] - elapsed
        self.dbus_service['/Ramp/TimeRemaining'] = max(0, int(remaining))

        if elapsed >= RAMP_CONFIG['cooldown_duration']:
            self.logger.info("Cooldown complete, starting recovery ramp")
            self.ramp_controller.start_ramp(now)
            self.overload_detector.set_ramp_start(now)
            self._transition_to(self.STATE_RECOVERY)

    def _handle_recovery(self, now: float, is_overload: bool):
        """RECOVERY: ramping back up after an overload."""
        if is_overload:
            self._handle_overload_event(now)
            return

        # Check if output power increased - may allow a higher target
        result = self.ramp_controller.check_output_power_increase(now)
        if result:
            self.logger.info(
                f"Output power increase detected during recovery: "
                f"target raised to {result['new_target']}A"
            )

        new_limit = self.ramp_controller.update(now)
        if new_limit is not None:
            self._set_current_limit(new_limit)
        if not self.ramp_controller.is_ramping:
            self.logger.info("Recovery complete, entering stable state")
            self._transition_to(self.STATE_STABLE)

    def _handle_stable(self, now: float, is_overload: bool):
        """STABLE: maintaining the current limit, monitoring for overload."""
        if is_overload:
            self._handle_overload_event(now)
            return

        # Check if output power increased - may allow a higher target
        result = self.ramp_controller.check_output_power_increase(now)
        if result and result.get('should_ramp'):
            self.logger.info(
                f"Output power increased to {result['output_power']:.0f}W, "
                f"raising target to {result['new_target']}A and resuming ramp"
            )
            self.ramp_controller.start_ramp(now)
            self.overload_detector.set_ramp_start(now)
            self._transition_to(self.STATE_RECOVERY)
            return

        if self.ramp_controller.check_history_clear(now):
            self.logger.info("Overload history cleared after stable operation")
            if self.ramp_controller.should_retry_full_power():
                self.logger.info("Attempting full power ramp")
                self.ramp_controller.start_ramp(now)
                self.overload_detector.set_ramp_start(now)
                self._transition_to(self.STATE_RAMPING)

    def _handle_overload_event(self, now: float):
        """Handle an overload detection."""
        output_power = self.output_l1_power + self.output_l2_power
        current_limit = self.current_limit_setting

        # Dump verbose debug info BEFORE resetting the detector
        self.overload_detector.dump_overload_debug(
            current_limit=current_limit,
            output_power=output_power
        )

        result = self.ramp_controller.handle_overload(now, output_power=output_power)

        rapid_info = ""
        if result.get('is_rapid_overload'):
            rapid_info = f" [RAPID #{result['rapid_overload_count']}]"

        self.logger.warning(
            f"Overload #{result['overload_count']}{rapid_info}: "
            f"rolling back to {result['new_limit']}A, "
            f"recovery target: {result['recovery_target']}A, "
            f"fast target: {result['fast_recovery_target']:.1f}A "
            f"(output: {output_power:.0f}W)"
        )
        self._set_current_limit(result['new_limit'])
        self.overload_detector.reset()
        self._transition_to(self.STATE_COOLDOWN)


def main():
    """Main entry point."""
    DBusGMainLoop(set_as_default=True)

    print("=" * 60)
    print(f"Generator Current Ramp Controller v{VERSION}")
    print("=" * 60)

    mainloop = None

    def signal_handler(signum, frame):
        """Handle shutdown signals gracefully."""
        sig_name = signal.Signals(signum).name
        logging.info(f"Received {sig_name}, shutting down...")
        if mainloop is not None:
            mainloop.quit()

    # Register signal handlers for graceful shutdown
    signal.signal(signal.SIGTERM, signal_handler)
    signal.signal(signal.SIGINT, signal_handler)

    try:
        controller = GeneratorRampController()
        mainloop = GLib.MainLoop()
        mainloop.run()
    except KeyboardInterrupt:
        print("\nShutdown requested")
    except Exception as e:
        logging.error(f"Fatal error: {e}", exc_info=True)
        sys.exit(1)
    finally:
        logging.info("Service stopped")


if __name__ == '__main__':
    main()
449
dbus-generator-ramp/debug_input_tracker.py
Executable file
@@ -0,0 +1,449 @@
#!/usr/bin/env python3
"""
Debug Input Tracker for Generator Overload Detection

Monitors and outputs all tracked input variables to help debug
generator overload detection. Samples internally at the D-Bus rate (2 Hz)
while allowing a configurable output frequency.

Usage:
    ./debug_input_tracker.py                 # Output every 2 seconds
    ./debug_input_tracker.py --interval 5    # Output every 5 seconds
    ./debug_input_tracker.py --interval 0.5  # Output every sample
    ./debug_input_tracker.py --csv           # CSV output format
    ./debug_input_tracker.py --verbose       # Include raw buffer data

Author: Claude (Anthropic)
License: MIT
"""

import sys
import os
import argparse
from time import time, sleep
from datetime import datetime
from collections import deque

# Add velib_python to the path (Venus OS standard locations)
sys.path.insert(1, os.path.join(os.path.dirname(__file__), 'ext', 'velib_python'))
sys.path.insert(1, '/opt/victronenergy/velib_python')

try:
    import dbus
except ImportError:
    print("ERROR: dbus not available. This script must run on Venus OS.")
    print("For development testing, use the --mock flag.")
    dbus = None

from config import DBUS_CONFIG, OVERLOAD_CONFIG
from overload_detector import OverloadDetector


class InputTracker:
    """
    Tracks generator input power and deviation metrics for debugging.
    """

    def __init__(self, use_mock=False):
        self.use_mock = use_mock
        self.bus = None

        if not use_mock:
            if dbus is None:
                raise RuntimeError("dbus module not available")
            self.bus = dbus.SystemBus()

        # Services
        self.vebus_service = DBUS_CONFIG['vebus_service']
        self.generator_service = DBUS_CONFIG['generator_service']

        # Overload detector for deviation tracking
        self.detector = OverloadDetector(OVERLOAD_CONFIG)

        # Internal sample buffers for statistics
        self.sample_count = 0
        self.power_samples = deque(maxlen=100)  # Last 100 samples for stats

        # Track min/max/avg since the last output
        self.reset_interval_stats()

        # Mock state for testing
        self._mock_time = 0
        self._mock_base_power = 8000

    def reset_interval_stats(self):
        """Reset statistics tracked between output intervals."""
        self.interval_samples = 0
        self.interval_power_min = float('inf')
        self.interval_power_max = float('-inf')
        self.interval_power_sum = 0
        self.interval_overload_triggers = 0

    def _get_dbus_value(self, service, path):
        """Read a value from a D-Bus service."""
        if self.use_mock:
            return self._get_mock_value(service, path)
        try:
            obj = self.bus.get_object(service, path, introspect=False)
            return obj.GetValue(dbus_interface='com.victronenergy.BusItem')
        except Exception:
            return None

    def _get_mock_value(self, service, path):
        """Return mock values for development testing."""
        import random
        import math

        self._mock_time += 0.5

        if path == '/Ac/ActiveIn/L1/P':
            # Simulate power with some oscillation
            oscillation = 200 * math.sin(self._mock_time * 0.5)
            noise = random.gauss(0, 50)
            return self._mock_base_power + oscillation + noise
        elif path == '/Ac/ActiveIn/L2/P':
            return 0  # Single phase
        elif path == '/Ac/ActiveIn/Connected':
            return 1
        elif path == '/State':
            return 1  # Running
        elif path == '/Ac/In/1/CurrentLimit':
            return 45.0
        return None
    def read_inputs(self):
        """Read all input values from D-Bus"""
        # Power readings
        l1_power = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/L1/P')
        l2_power = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/L2/P')

        l1_power = float(l1_power) if l1_power is not None else 0.0
        l2_power = float(l2_power) if l2_power is not None else 0.0
        total_power = l1_power + l2_power

        # AC connection status
        ac_connected = self._get_dbus_value(self.vebus_service, '/Ac/ActiveIn/Connected')
        ac_connected = bool(ac_connected) if ac_connected is not None else False

        # Generator state
        gen_state = self._get_dbus_value(self.generator_service, '/State')
        gen_state = int(gen_state) if gen_state is not None else 0

        # Current limit
        current_limit = self._get_dbus_value(self.vebus_service, '/Ac/In/1/CurrentLimit')
        current_limit = float(current_limit) if current_limit is not None else 0.0

        return {
            'timestamp': time(),
            'l1_power': l1_power,
            'l2_power': l2_power,
            'total_power': total_power,
            'ac_connected': ac_connected,
            'generator_state': gen_state,
            'current_limit': current_limit,
        }

    def sample(self):
        """
        Take a single sample and update internal tracking.
        Returns sample data with overload detection diagnostics.
        """
        inputs = self.read_inputs()

        # Update overload detector
        is_overload, diag = self.detector.update(
            inputs['l1_power'],
            inputs['l2_power'],
            inputs['timestamp']
        )

        self.sample_count += 1
        self.power_samples.append(inputs['total_power'])

        # Update interval statistics
        self.interval_samples += 1
        self.interval_power_sum += inputs['total_power']
        self.interval_power_min = min(self.interval_power_min, inputs['total_power'])
        self.interval_power_max = max(self.interval_power_max, inputs['total_power'])
        if is_overload:
            self.interval_overload_triggers += 1

        return {
            **inputs,
            'is_overload': is_overload,
            'diagnostics': diag,
            'sample_number': self.sample_count,
        }

    def get_interval_stats(self):
        """Get statistics for the current output interval"""
        if self.interval_samples == 0:
            return None

        return {
            'samples': self.interval_samples,
            'power_min': self.interval_power_min,
            'power_max': self.interval_power_max,
            'power_avg': self.interval_power_sum / self.interval_samples,
            'power_range': self.interval_power_max - self.interval_power_min,
            'overload_triggers': self.interval_overload_triggers,
        }

    def get_buffer_stats(self):
        """Get statistics from the power sample buffer"""
        if len(self.power_samples) < 2:
            return None

        samples = list(self.power_samples)
        n = len(samples)
        mean = sum(samples) / n
        variance = sum((x - mean) ** 2 for x in samples) / n
        std_dev = variance ** 0.5

        return {
            'buffer_size': n,
            'mean': mean,
            'std_dev': std_dev,
            'min': min(samples),
            'max': max(samples),
        }
def format_table_output(sample, interval_stats, buffer_stats, verbose=False):
    """Format output as a readable table"""
    lines = []

    # Header with timestamp
    ts = datetime.fromtimestamp(sample['timestamp']).strftime('%Y-%m-%d %H:%M:%S.%f')[:-3]
    lines.append(f"\n{'=' * 70}")
    lines.append(f"Debug Input Tracker - {ts}")
    lines.append(f"{'=' * 70}")

    # Generator status
    gen_states = {0: 'Stopped', 1: 'Running', 2: 'Warm-up', 3: 'Cool-down', 10: 'Error'}
    gen_state_name = gen_states.get(sample['generator_state'], f"Unknown({sample['generator_state']})")

    lines.append("\n--- Generator Status ---")
    lines.append(f"  State:          {gen_state_name}")
    lines.append(f"  AC Connected:   {'Yes' if sample['ac_connected'] else 'No'}")
    lines.append(f"  Current Limit:  {sample['current_limit']:.1f} A")

    # Power readings
    lines.append("\n--- Power Readings ---")
    lines.append(f"  L1 Power:       {sample['l1_power']:>8.1f} W")
    lines.append(f"  L2 Power:       {sample['l2_power']:>8.1f} W")
    lines.append(f"  Total Power:    {sample['total_power']:>8.1f} W")

    # Interval statistics
    if interval_stats:
        lines.append(f"\n--- Interval Stats ({interval_stats['samples']} samples) ---")
        lines.append(f"  Power Min:      {interval_stats['power_min']:>8.1f} W")
        lines.append(f"  Power Max:      {interval_stats['power_max']:>8.1f} W")
        lines.append(f"  Power Avg:      {interval_stats['power_avg']:>8.1f} W")
        lines.append(f"  Power Range:    {interval_stats['power_range']:>8.1f} W")

    # Deviation/detection metrics
    diag = sample['diagnostics']
    lines.append("\n--- Overload Detection ---")

    if diag.get('status') == 'warming_up':
        lines.append(f"  Status: Warming up ({diag.get('samples', 0)}/{diag.get('needed', '?')} samples)")
    elif diag.get('status') == 'lockout':
        lines.append(f"  Status: Lockout ({diag.get('lockout_remaining', 0):.1f}s remaining)")
    else:
        # Method 1: reversals
        reversals = diag.get('reversals', 0)
        reversal_thresh = diag.get('reversal_threshold', OVERLOAD_CONFIG['reversal_threshold'])
        method1 = diag.get('method1_triggered', False)
        lines.append(f"  Reversals:      {reversals:>3d} / {reversal_thresh} threshold {'[TRIGGERED]' if method1 else ''}")

        # Method 2: standard deviation
        std_dev = diag.get('std_dev', 0)
        std_thresh = diag.get('std_dev_threshold', OVERLOAD_CONFIG['std_dev_threshold'])
        method2 = diag.get('method2_triggered', False)
        lines.append(f"  Std Deviation:  {std_dev:>7.1f} W / {std_thresh} W threshold {'[TRIGGERED]' if method2 else ''}")

        # Combined detection
        instant = diag.get('instant_detection', False)
        confirmed = diag.get('confirmed_count', 0)
        confirm_thresh = diag.get('confirmation_threshold', OVERLOAD_CONFIG['confirmation_threshold'])
        lines.append(f"  Instant Detect: {'Yes' if instant else 'No'}")
        lines.append(f"  Confirmations:  {confirmed:>3d} / {confirm_thresh} threshold")

    # Final overload status
    is_overload = sample['is_overload']
    status = "*** OVERLOAD DETECTED ***" if is_overload else "Normal"
    lines.append(f"  Status:         {status}")

    # Verbose buffer data
    if verbose and buffer_stats:
        lines.append(f"\n--- Buffer Statistics ({buffer_stats['buffer_size']} samples) ---")
        lines.append(f"  Mean Power:     {buffer_stats['mean']:>8.1f} W")
        lines.append(f"  Std Deviation:  {buffer_stats['std_dev']:>8.1f} W")
        lines.append(f"  Min:            {buffer_stats['min']:>8.1f} W")
        lines.append(f"  Max:            {buffer_stats['max']:>8.1f} W")

    lines.append(f"\n  Sample #: {sample['sample_number']}")

    return '\n'.join(lines)
def format_csv_output(sample, interval_stats, include_header=False):
    """Format output as CSV"""
    headers = [
        'timestamp', 'sample_num', 'gen_state', 'ac_connected', 'current_limit',
        'l1_power', 'l2_power', 'total_power',
        'reversals', 'std_dev', 'method1_triggered', 'method2_triggered',
        'instant_detection', 'confirmed_count', 'is_overload',
        'interval_samples', 'power_min', 'power_max', 'power_avg', 'power_range'
    ]

    diag = sample['diagnostics']

    values = [
        datetime.fromtimestamp(sample['timestamp']).strftime('%Y-%m-%d %H:%M:%S.%f')[:-3],
        sample['sample_number'],
        sample['generator_state'],
        1 if sample['ac_connected'] else 0,
        f"{sample['current_limit']:.1f}",
        f"{sample['l1_power']:.1f}",
        f"{sample['l2_power']:.1f}",
        f"{sample['total_power']:.1f}",
        diag.get('reversals', ''),
        f"{diag.get('std_dev', 0):.1f}" if 'std_dev' in diag else '',
        1 if diag.get('method1_triggered', False) else 0,
        1 if diag.get('method2_triggered', False) else 0,
        1 if diag.get('instant_detection', False) else 0,
        diag.get('confirmed_count', ''),
        1 if sample['is_overload'] else 0,
    ]

    # Interval stats
    if interval_stats:
        values.extend([
            interval_stats['samples'],
            f"{interval_stats['power_min']:.1f}",
            f"{interval_stats['power_max']:.1f}",
            f"{interval_stats['power_avg']:.1f}",
            f"{interval_stats['power_range']:.1f}",
        ])
    else:
        values.extend(['', '', '', '', ''])

    lines = []
    if include_header:
        lines.append(','.join(headers))
    lines.append(','.join(str(v) for v in values))

    return '\n'.join(lines)
def main():
    parser = argparse.ArgumentParser(
        description='Debug tracker for generator input variables and overload detection',
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  %(prog)s                     # Output every 2 seconds (default)
  %(prog)s --interval 5        # Output every 5 seconds
  %(prog)s --interval 0.5      # Output every sample (500ms)
  %(prog)s --csv               # CSV format for logging
  %(prog)s --csv > debug.csv   # Log to file
  %(prog)s --verbose           # Include buffer statistics
  %(prog)s --mock              # Use mock data for testing
"""
    )

    parser.add_argument(
        '-i', '--interval',
        type=float,
        default=2.0,
        help='Output interval in seconds (default: 2.0, min: 0.5)'
    )

    parser.add_argument(
        '--csv',
        action='store_true',
        help='Output in CSV format'
    )

    parser.add_argument(
        '-v', '--verbose',
        action='store_true',
        help='Include detailed buffer statistics'
    )

    parser.add_argument(
        '--mock',
        action='store_true',
        help='Use mock data for development testing'
    )

    parser.add_argument(
        '--sample-rate',
        type=float,
        default=0.5,
        help='Internal sample rate in seconds (default: 0.5 = 2 Hz D-Bus rate)'
    )

    args = parser.parse_args()

    # Validate interval
    if args.interval < args.sample_rate:
        args.interval = args.sample_rate

    # Initialize tracker
    try:
        tracker = InputTracker(use_mock=args.mock)
    except RuntimeError as e:
        print(f"ERROR: {e}", file=sys.stderr)
        sys.exit(1)

    if not args.csv:
        print("Starting debug input tracker...")
        print(f"  Sample rate: {args.sample_rate}s ({1 / args.sample_rate:.1f} Hz)")
        print(f"  Output interval: {args.interval}s")
        print("  Press Ctrl+C to stop\n")

    # Emit the CSV header on the first output only
    first_output = True

    # Timing
    last_output_time = 0

    try:
        while True:
            now = time()

            # Take a sample
            sample = tracker.sample()

            # Check if it's time to output
            if now - last_output_time >= args.interval:
                interval_stats = tracker.get_interval_stats()
                buffer_stats = tracker.get_buffer_stats() if args.verbose else None

                if args.csv:
                    print(format_csv_output(sample, interval_stats, include_header=first_output))
                else:
                    print(format_table_output(sample, interval_stats, buffer_stats, args.verbose))

                # Flush for real-time output when piping
                sys.stdout.flush()

                # Reset interval tracking
                tracker.reset_interval_stats()
                last_output_time = now
                first_output = False

            # Wait for the next sample
            sleep(args.sample_rate)

    except KeyboardInterrupt:
        if not args.csv:
            print("\n\nStopped.")
        sys.exit(0)


if __name__ == '__main__':
    main()
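The `--mock` signal above (8000 W base, a ±200 W oscillation, and 50 W Gaussian noise) is exactly what `get_buffer_stats()` summarises. A minimal standalone sketch of that computation, using the same deque-of-100 buffer and population standard deviation; the 2 Hz time step mirrors the mock code, while the seed is arbitrary and no value here comes from `OVERLOAD_CONFIG`:

```python
import math
import random
from collections import deque

random.seed(42)  # arbitrary seed, just to make the sketch repeatable
samples = deque(maxlen=100)  # same buffer size as InputTracker.power_samples

t = 0.0
for _ in range(200):
    t += 0.5  # 2 Hz sampling, as in _get_mock_value
    power = 8000 + 200 * math.sin(t * 0.5) + random.gauss(0, 50)
    samples.append(power)

# Population mean and standard deviation, as in get_buffer_stats()
n = len(samples)
mean = sum(samples) / n
std_dev = (sum((x - mean) ** 2 for x in samples) / n) ** 0.5

print(f"buffer={n} mean={mean:.1f} W std_dev={std_dev:.1f} W")
```

A steadily oscillating load like this yields a std-dev well below the oscillation amplitude (roughly amplitude/√2 plus noise), which is the quantity the detector's "method 2" compares against its configured threshold.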
224
dbus-generator-ramp/deploy.sh
Executable file
@@ -0,0 +1,224 @@
#!/bin/bash
#
# Deploy script for the Generator Current Ramp Controller
#
# Deploys a built package to a Cerbo GX device via SSH.
#
# Usage:
#   ./deploy.sh <cerbo-ip> [package.tar.gz]
#   ./deploy.sh 192.168.1.100                                  # Uses latest package in current dir
#   ./deploy.sh 192.168.1.100 dbus-generator-ramp-1.0.0.tar.gz
#   ./deploy.sh 192.168.1.100 --webui                          # Also install web UI
#
# Prerequisites:
#   - SSH access enabled on the Cerbo GX (Settings > General > SSH)
#   - Package built with ./build-package.sh
#

set -e

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Script directory
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# Default values
CERBO_IP=""
PACKAGE=""
INSTALL_WEBUI=""
SSH_USER="root"
REMOTE_DIR="/data"

# Print usage
usage() {
    echo "Usage: $0 <cerbo-ip> [OPTIONS] [package.tar.gz]"
    echo ""
    echo "Arguments:"
    echo "  cerbo-ip          IP address of the Cerbo GX"
    echo "  package.tar.gz    Package file (default: latest in current directory)"
    echo ""
    echo "Options:"
    echo "  --webui           Also install the web UI service"
    echo "  --user USER       SSH user (default: root)"
    echo "  -h, --help        Show this help message"
    echo ""
    echo "Examples:"
    echo "  $0 192.168.1.100"
    echo "  $0 192.168.1.100 --webui"
    echo "  $0 192.168.1.100 dbus-generator-ramp-1.2.0.tar.gz"
    echo ""
    echo "Prerequisites:"
    echo "  1. Enable SSH on the Cerbo GX: Settings > General > SSH"
    echo "  2. Build the package first: ./build-package.sh"
    exit 1
}
# Parse arguments
while [[ $# -gt 0 ]]; do
    case $1 in
        --webui)
            INSTALL_WEBUI="--webui"
            shift
            ;;
        --user)
            SSH_USER="$2"
            shift 2
            ;;
        -h|--help)
            usage
            ;;
        -*)
            echo -e "${RED}Error: Unknown option: $1${NC}"
            usage
            ;;
        *)
            if [ -z "$CERBO_IP" ]; then
                CERBO_IP="$1"
            elif [ -z "$PACKAGE" ]; then
                PACKAGE="$1"
            else
                echo -e "${RED}Error: Too many arguments${NC}"
                usage
            fi
            shift
            ;;
    esac
done

# Check required arguments
if [ -z "$CERBO_IP" ]; then
    echo -e "${RED}Error: Cerbo GX IP address required${NC}"
    echo ""
    usage
fi
# Find the package if not specified
if [ -z "$PACKAGE" ]; then
    # Look for the latest package in the script directory
    PACKAGE=$(ls -t "$SCRIPT_DIR"/dbus-generator-ramp-*.tar.gz 2>/dev/null | head -1)
    if [ -z "$PACKAGE" ]; then
        echo -e "${RED}Error: No package found. Run ./build-package.sh first${NC}"
        exit 1
    fi
fi

# Verify the package exists
if [ ! -f "$PACKAGE" ]; then
    # Try the script directory
    if [ -f "$SCRIPT_DIR/$PACKAGE" ]; then
        PACKAGE="$SCRIPT_DIR/$PACKAGE"
    else
        echo -e "${RED}Error: Package not found: $PACKAGE${NC}"
        exit 1
    fi
fi

PACKAGE_NAME=$(basename "$PACKAGE")

echo "=================================================="
echo "Deploying to Cerbo GX"
echo "=================================================="
echo "Target:  $SSH_USER@$CERBO_IP"
echo "Package: $PACKAGE_NAME"
echo "Web UI:  ${INSTALL_WEBUI:-no}"
echo ""

# Test the SSH connection
echo -e "${YELLOW}1. Testing SSH connection...${NC}"
if ! ssh -o ConnectTimeout=5 -o BatchMode=yes "$SSH_USER@$CERBO_IP" "echo 'SSH OK'" 2>/dev/null; then
    echo -e "${RED}   SSH connection failed.${NC}"
    echo ""
    echo "   Troubleshooting:"
    echo "   - Verify the Cerbo GX IP address: $CERBO_IP"
    echo "   - Enable SSH on the Cerbo GX: Settings > General > SSH"
    echo "   - Check password/key authentication"
    echo ""
    echo "   Try a manual connection:"
    echo "     ssh $SSH_USER@$CERBO_IP"
    exit 1
fi
echo -e "${GREEN}   SSH connection OK${NC}"
# Copy the package
echo -e "${YELLOW}2. Copying package to the Cerbo GX...${NC}"
scp "$PACKAGE" "$SSH_USER@$CERBO_IP:$REMOTE_DIR/"
echo -e "${GREEN}   Package copied${NC}"

# Install on the Cerbo GX
echo -e "${YELLOW}3. Installing on the Cerbo GX...${NC}"
ssh "$SSH_USER@$CERBO_IP" bash <<EOF
set -e
cd $REMOTE_DIR

# Stop the existing service if running
if [ -d /service/dbus-generator-ramp ]; then
    echo "   Stopping existing service..."
    svc -d /service/dbus-generator-ramp 2>/dev/null || true
    # Wait for the service to fully stop and release its D-Bus name
    sleep 3
fi

# Remove the old installation directory (but keep learned data)
if [ -d $REMOTE_DIR/dbus-generator-ramp ]; then
    echo "   Backing up learned model..."
    if [ -f $REMOTE_DIR/dbus-generator-ramp/learned_model.json ]; then
        cp $REMOTE_DIR/dbus-generator-ramp/learned_model.json /tmp/learned_model.json.bak 2>/dev/null || true
    fi
    echo "   Removing old installation..."
    rm -rf $REMOTE_DIR/dbus-generator-ramp
fi

# Extract the new package
echo "   Extracting package..."
tar -xzf $PACKAGE_NAME

# Restore the learned model if it existed
if [ -f /tmp/learned_model.json.bak ]; then
    echo "   Restoring learned model..."
    mkdir -p $REMOTE_DIR/dbus-generator-ramp
    mv /tmp/learned_model.json.bak $REMOTE_DIR/dbus-generator-ramp/learned_model.json
fi

# Run the install script
echo "   Running install.sh..."
cd $REMOTE_DIR/dbus-generator-ramp
./install.sh $INSTALL_WEBUI

# Clean up the package file
rm -f $REMOTE_DIR/$PACKAGE_NAME
EOF

echo -e "${GREEN}   Installation complete${NC}"

# Check service status
echo -e "${YELLOW}4. Checking service status...${NC}"
sleep 2
ssh "$SSH_USER@$CERBO_IP" bash <<'EOF'
if command -v svstat >/dev/null 2>&1; then
    echo ""
    svstat /service/dbus-generator-ramp 2>/dev/null || echo "   Service not yet supervised"
    echo ""
    echo "Recent logs:"
    if [ -f /var/log/dbus-generator-ramp/current ]; then
        tail -10 /var/log/dbus-generator-ramp/current | tai64nlocal 2>/dev/null
    else
        echo "   No logs yet"
    fi
fi
EOF

echo ""
echo "=================================================="
echo -e "${GREEN}Deployment complete!${NC}"
echo "=================================================="
echo ""
echo "Useful commands (on the Cerbo GX):"
echo "  svstat /service/dbus-generator-ramp"
echo "  tail -F /var/log/dbus-generator-ramp/current | tai64nlocal"
echo "  svc -t /service/dbus-generator-ramp   # restart"
echo "  svc -d /service/dbus-generator-ramp   # stop"
echo ""
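A capture made with `./debug_input_tracker.py --csv > debug.csv` can be post-processed with Python's `csv` module, since `format_csv_output` emits a header row on first output. A minimal sketch; the two data rows here are an inline, made-up sample so the snippet is self-contained (with a real capture, replace the `io.StringIO(...)` with `open('debug.csv')`):

```python
import csv
import io

# Inline stand-in for a real debug.csv capture (illustrative values only)
sample_log = io.StringIO(
    "timestamp,sample_num,gen_state,ac_connected,current_limit,"
    "l1_power,l2_power,total_power,"
    "reversals,std_dev,method1_triggered,method2_triggered,"
    "instant_detection,confirmed_count,is_overload,"
    "interval_samples,power_min,power_max,power_avg,power_range\n"
    "2024-01-01 12:00:00.000,1,1,1,45.0,8100.0,0.0,8100.0,"
    "2,120.0,0,0,0,0,0,4,7900.0,8250.0,8080.0,350.0\n"
    "2024-01-01 12:00:02.000,5,1,1,45.0,8600.0,0.0,8600.0,"
    "9,340.0,1,1,1,3,1,4,8100.0,8900.0,8500.0,800.0\n"
)

rows = list(csv.DictReader(sample_log))
overloads = [r for r in rows if r['is_overload'] == '1']
avg_power = sum(float(r['total_power']) for r in rows) / len(rows)
print(f"{len(rows)} rows, {len(overloads)} overload rows, avg {avg_power:.1f} W")
```

This is the intended workflow behind the `--csv > debug.csv` example in the script's epilog: log on the device, copy the file off, and analyse the detector columns offline.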
62
dbus-generator-ramp/docker-compose.yml
Normal file
@@ -0,0 +1,62 @@
# docker-compose.yml for Generator Current Ramp Controller development
#
# Usage:
#   docker-compose build                                     # Build the development image
#   docker-compose run --rm dev bash                         # Interactive shell
#   docker-compose run --rm dev python overload_detector.py  # Run detector tests
#   docker-compose run --rm dev python ramp_controller.py    # Run ramp tests
#
# Note: Full D-Bus integration requires a real Venus OS device.
# This environment is for unit testing components.

version: '3.8'

services:
  dev:
    build:
      context: .
      dockerfile: Dockerfile

    volumes:
      # Mount source code for live editing
      - .:/app

      # Mount test data if you have recorded samples
      # - ./test_data:/app/test_data

    environment:
      - PYTHONPATH=/app:/app/ext/velib_python
      - PYTHONUNBUFFERED=1

    # Keep the container running for interactive use
    stdin_open: true
    tty: true

    # Default command
    command: bash

  # Optional: mock D-Bus service for integration testing.
  # This would require additional setup to mock Venus OS services.
  mock-dbus:
    build:
      context: .
      dockerfile: Dockerfile

    volumes:
      - .:/app
      - dbus-socket:/var/run/dbus

    environment:
      - DBUS_SYSTEM_BUS_ADDRESS=unix:path=/var/run/dbus/system_bus_socket

    command: >
      bash -c "
        dbus-daemon --system --fork &&
        echo 'Mock D-Bus running' &&
        tail -f /dev/null
      "

    privileged: true

volumes:
  dbus-socket:
Some files were not shown because too many files have changed in this diff.