This commit is contained in:
2026-03-26 18:23:55 +00:00
parent 0ef05034b0
commit 6028fd37e0
20 changed files with 275 additions and 103 deletions

WATCHDOG_DEBUG_REPORT.md Normal file
View File

@@ -0,0 +1,111 @@
# Watchdog Reboot Debug Report - 2026-03-26
## Problem
Cerbo GX (einstein) triggered a watchdog reboot on Mar 24 20:13:43 due to a sustained high load average (11/9/7 for the 1/5/15-minute windows) exceeding the configured thresholds (0/10/6); the 15-minute load of 7 exceeded its threshold of 6.
## Root Cause
Cumulative CPU pressure from **7 custom Python D-Bus services** plus **dbus-raymarine-publisher** (multicast decoder) running simultaneously on a dual-core ARM Cortex-A7.
### Services Running at Time of Reboot
1. **dbus-anchor-alarm** - 1 Hz, 10 D-Bus reads/sec, circle fitting on 1800 points, serializing 5000 track points to JSON every 5s
2. **dbus-generator-ramp** - 2 Hz (500ms), multiple D-Bus reads + regression math
3. **dbus-tides** - 1 Hz, SQLite writes + harmonic calculations
4. **dbus-meteoblue-forecast** - periodic HTTP API calls
5. **dbus-no-foreign-land** - periodic GPS uploads
6. **dbus-windy-station** - periodic sensor uploads
7. **dbus-raymarine-publisher** - continuous multicast protobuf decoding (12% CPU sustained)
### Key Findings
- **dbus-daemon**: 13% CPU (bottleneck from ~20 services making synchronous GetValue() calls)
- **dbus-raymarine-publisher**: 12% CPU (multiple threads continuously parsing multicast packets)
- **Total Python CPU**: ~50% aggregate across all custom services
- **Memory**: OK (609MB available of 1GB)
- **No crash loops**: all services had uptimes of 152K+ seconds (~42 hours)
## Optimizations Applied (v2.1.0)
### 1. dbus-generator-ramp
- **Changed**: Main loop from 500ms → 1000ms (2Hz → 1Hz)
- **File**: `dbus-generator-ramp/config.py` line 257
- **Impact**: 50% reduction in D-Bus polling and math operations
- **Version**: 2.0.0 → 2.1.0
### 2. dbus-anchor-alarm
- **Changed**: JSON update interval from 5s → 20s
- **File**: `dbus-anchor-alarm/anchor_alarm.py` line 78
- **Impact**: 75% reduction in large JSON serializations
- **Changed**: Track buffer from 5000 → 2000 points
- **File**: `dbus-anchor-alarm/track_buffer.py` line 16
- **Impact**: 60% less data to serialize and transmit over MQTT
- **Version**: 2.0.0 → 2.1.0
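Both anchor-alarm changes apply the same pattern: gate an expensive periodic task behind a wall-clock check so it runs at a lower frequency than the main loop. A minimal sketch of that throttle (class and names are illustrative, not the actual anchor-alarm code):

```python
import time

class ThrottledTask:
    """Run an expensive callback at most once per `interval_sec` seconds."""

    def __init__(self, callback, interval_sec):
        self.callback = callback
        self.interval_sec = interval_sec
        self._last_run = 0.0  # monotonic timestamp of the last execution

    def tick(self):
        """Call from every main-loop pass; runs the callback only when due."""
        now = time.monotonic()
        if now - self._last_run < self.interval_sec:
            return False  # not due yet: skip the expensive work
        self._last_run = now
        self.callback()
        return True
```

Raising the interval from 5s to 20s leaves the main loop rate untouched while cutting the serialization work by 75%, which is why the change is low-risk.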
## Load Average Results
**Before optimizations:**
```
14:40:30 load average: 1.04, 2.30, 2.71
14:44:10 load average: 3.91, 2.50, 2.65 (after anchor-alarm restarted)
14:52:37 load average: 1.29, 3.93, 3.77
14:55:10 load average: 7.04, 6.21, 4.76 (critical)
```
**After optimizations (v2.1.0):**
```
15:05:21 load average: 1.69, 3.95, 4.24
15:06:01 load average: 0.99, 3.48, 4.07
15:06:41 load average: 0.64, 3.08, 3.91
15:07:42 load average: 1.35, 2.87, 3.78 (trending down)
```
**Status**: 15-minute load declining from 4.76 → 3.78 and should continue dropping below the watchdog threshold (6.0) over the next 15 minutes.
## Remaining Concerns
### High-Risk Service: dbus-raymarine-publisher (12% sustained CPU)
- Continuous multicast parsing with multiple threads
- Running at 1Hz D-Bus update but packet decoding is continuous
- **Recommendation**: Monitor this service closely; consider adding `--update-interval 2000` (1 Hz → 0.5 Hz) if load remains elevated
### System-Wide D-Bus Pressure
- `dbus-daemon` at 13% CPU indicates bus saturation
- 20+ services making synchronous calls
- **Future optimization**: Implement D-Bus signal subscriptions instead of polling where possible
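The subscription approach replaces synchronous `GetValue()` round-trips with a local cache that is pushed to by change signals, so reads cost no bus traffic at all. A sketch of the pattern (class and handler names are hypothetical; with dbus-python the handler would be registered via `bus.add_signal_receiver(...)` for the `PropertiesChanged` signal):

```python
class SignalCache:
    """Cache D-Bus values pushed via PropertiesChanged signals.

    Reads become local dict lookups instead of synchronous GetValue()
    round-trips through dbus-daemon.
    """

    def __init__(self):
        self._values = {}

    def on_properties_changed(self, path, changes):
        # Signal handler: e.g. changes = {'Value': 12.6}
        if 'Value' in changes:
            self._values[path] = changes['Value']

    def get(self, path, default=None):
        # Local read; no bus message, no dbus-daemon CPU cost
        return self._values.get(path, default)
```

With ~20 services polling, even a partial migration to this pattern should take direct pressure off `dbus-daemon`, since each poll currently costs two bus messages (call + reply) per value per tick.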
## Monitoring Commands
Check load every minute:
```bash
ssh cerbo "watch -n 60 uptime"
```
Monitor Python service CPU:
```bash
ssh cerbo "while true; do top -b -n 1 | grep python3 | head -n 10; sleep 30; done"
```
Check service health:
```bash
ssh cerbo "svstat /service/dbus-* 2>/dev/null | grep -v 'up.*seconds'"
```
## Next Steps if Load Remains High
1. Reduce the raymarine publisher update interval to 2000ms (0.5 Hz)
2. Consider disabling debug logging on anchor-alarm (SQLite writes every 15s)
3. Evaluate if all 7 services need to run continuously (some could be on-demand)
4. Long-term: consolidate low-frequency services (meteoblue, windy, nfl) into a single process
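The consolidation in step 4 amounts to running several low-frequency jobs from one interpreter and one main loop instead of N separate Python processes. A minimal scheduler sketch of that idea (hypothetical, not existing code; each job would wrap one service's periodic work):

```python
import time

class Scheduler:
    """Run several low-frequency jobs from a single main loop.

    One interpreter and one loop replace N separate Python processes,
    cutting per-process interpreter overhead and D-Bus connections.
    """

    def __init__(self, jobs):
        # jobs: list of (interval_sec, callable)
        self.jobs = jobs
        self._next_due = [0.0] * len(jobs)

    def tick(self, now=None):
        """Run every job that is due; call once per main-loop pass."""
        if now is None:
            now = time.monotonic()
        for i, (interval, fn) in enumerate(self.jobs):
            if now >= self._next_due[i]:
                self._next_due[i] = now + interval
                fn()
```

This fits the three candidates named above well because meteoblue, windy, and nfl are all periodic uploaders with no tight latency requirements.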
## Files Modified
- `dbus-generator-ramp/config.py` (main_loop_interval_ms: 500 → 1000)
- `dbus-generator-ramp/dbus-generator-ramp.py` (VERSION: 2.0.0 → 2.1.0)
- `dbus-generator-ramp/build-package.sh` (VERSION: 1.0.0 → 2.1.0)
- `dbus-anchor-alarm/config.py` (VERSION: 2.0.0 → 2.1.0)
- `dbus-anchor-alarm/anchor_alarm.py` (_JSON_UPDATE_INTERVAL_SEC: 5.0 → 20.0)
- `dbus-anchor-alarm/track_buffer.py` (MAX_POINTS: 5000 → 2000)
- `dbus-anchor-alarm/build-package.sh` (VERSION: 2.0.0 → 2.1.0)
## Deployed Packages
- `dbus-generator-ramp-2.1.0.tar.gz` (installed and running)
- `dbus-anchor-alarm-2.1.0.tar.gz` (installed and running)

View File

@@ -24,7 +24,7 @@ set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
VERSION="1.0.0"
VERSION="1.1.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-raymarine-publisher"

View File

@@ -20,7 +20,7 @@ Example usage:
print(sentence)
"""
__version__ = "1.0.0"
__version__ = "1.1.0"
__author__ = "Axiom NMEA Project"
# Core protocol components

View File

@@ -9,7 +9,7 @@ import threading
import time
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, Any, Optional
from typing import Any, ClassVar, Dict, Optional
from ..sensors import (
get_tank_name,
@@ -88,6 +88,17 @@ class SensorData:
# Thread safety
_lock: threading.Lock = field(default_factory=threading.Lock)
_TIME_ATTRS: ClassVar[dict] = {
'gps': 'gps_time',
'heading': 'heading_time',
'wind': 'wind_time',
'depth': 'depth_time',
'temp': 'temp_time',
'pressure': 'pressure_time',
'tank': 'tank_time',
'battery': 'battery_time',
}
@property
def depth_ft(self) -> Optional[float]:
"""Get depth in feet."""
@@ -127,38 +138,34 @@ class SensorData:
Args:
decoded: DecodedData object from RaymarineDecoder.decode()
"""
# Import here to avoid circular import
from ..protocol.decoder import DecodedData
now = time.time()
has_data = False
with self._lock:
self.packet_count += 1
if decoded.has_data():
self.decode_count += 1
# Update GPS
if decoded.latitude is not None and decoded.longitude is not None:
self.latitude = decoded.latitude
self.longitude = decoded.longitude
self.gps_time = now
has_data = True
# Update heading
if decoded.heading_deg is not None:
self.heading_deg = decoded.heading_deg
self.heading_time = now
has_data = True
# Update COG/SOG
if decoded.cog_deg is not None:
self.cog_deg = decoded.cog_deg
has_data = True
if decoded.sog_kts is not None:
self.sog_kts = decoded.sog_kts
has_data = True
# Update wind
if (decoded.twd_deg is not None or decoded.tws_kts is not None or
decoded.aws_kts is not None):
self.wind_time = now
has_data = True
if decoded.twd_deg is not None:
self.twd_deg = decoded.twd_deg
if decoded.tws_kts is not None:
@@ -168,33 +175,36 @@ class SensorData:
if decoded.aws_kts is not None:
self.aws_kts = decoded.aws_kts
# Update depth
if decoded.depth_m is not None:
self.depth_m = decoded.depth_m
self.depth_time = now
has_data = True
# Update temperature
if decoded.water_temp_c is not None or decoded.air_temp_c is not None:
self.temp_time = now
has_data = True
if decoded.water_temp_c is not None:
self.water_temp_c = decoded.water_temp_c
if decoded.air_temp_c is not None:
self.air_temp_c = decoded.air_temp_c
# Update pressure
if decoded.pressure_mbar is not None:
self.pressure_mbar = decoded.pressure_mbar
self.pressure_time = now
has_data = True
# Update tanks
if decoded.tanks:
self.tanks.update(decoded.tanks)
self.tank_time = now
has_data = True
# Update batteries
if decoded.batteries:
self.batteries.update(decoded.batteries)
self.battery_time = now
has_data = True
if has_data:
self.decode_count += 1
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary for JSON serialization.
@@ -264,18 +274,11 @@ class SensorData:
Returns:
Age in seconds, or None if no data has been received
"""
time_map = {
'gps': self.gps_time,
'heading': self.heading_time,
'wind': self.wind_time,
'depth': self.depth_time,
'temp': self.temp_time,
'pressure': self.pressure_time,
'tank': self.tank_time,
'battery': self.battery_time,
}
attr = self._TIME_ATTRS.get(data_type)
if attr is None:
return None
with self._lock:
ts = time_map.get(data_type, 0)
ts = getattr(self, attr)
if ts == 0:
return None
return time.time() - ts

View File

@@ -7,13 +7,17 @@ incoming Raymarine packets.
import socket
import struct
import time
import threading
from typing import List, Tuple, Optional, Callable
from ..protocol.decoder import RaymarineDecoder, DecodedData
from ..protocol.constants import HEADER_SIZE
from ..data.store import SensorData
from ..sensors import MULTICAST_GROUPS
_MIN_PACKET_SIZE = HEADER_SIZE + 20
class MulticastListener:
"""Listens on Raymarine multicast groups for sensor data.
@@ -48,6 +52,7 @@ class MulticastListener:
groups: Optional[List[Tuple[str, int]]] = None,
on_packet: Optional[Callable[[bytes, str, int], None]] = None,
on_decode: Optional[Callable[[DecodedData], None]] = None,
min_decode_interval: float = 0.2,
):
"""Initialize the multicast listener.
@@ -61,6 +66,11 @@ class MulticastListener:
Called with (packet_bytes, group, port).
on_decode: Optional callback for each decoded packet.
Called with DecodedData object.
min_decode_interval: Minimum seconds between decodes (0 to
decode every packet). Packets arriving
faster than this are drained from the
socket but not decoded. Default 200ms
gives 5 decodes per second.
"""
self.decoder = decoder
self.sensor_data = sensor_data
@@ -68,6 +78,7 @@ class MulticastListener:
self.groups = groups or MULTICAST_GROUPS
self.on_packet = on_packet
self.on_decode = on_decode
self.min_decode_interval = min_decode_interval
self.running = False
self.sockets: List[socket.socket] = []
@@ -113,26 +124,37 @@ class MulticastListener:
def _listen(self, sock: socket.socket, group: str, port: int) -> None:
"""Listen on a socket and decode packets.
Packets smaller than the minimum protobuf size are discarded.
When min_decode_interval > 0, intermediate packets are drained
from the socket without decoding to reduce CPU.
Args:
sock: UDP socket
group: Multicast group (for logging)
port: Port number (for logging)
"""
last_decode = 0.0
throttle = self.min_decode_interval
while self.running:
try:
data, addr = sock.recvfrom(65535)
# Optional packet callback
if self.on_packet:
self.on_packet(data, group, port)
# Decode the packet
result = self.decoder.decode(data)
if len(data) < _MIN_PACKET_SIZE:
continue
# Update sensor data
if throttle > 0:
now = time.monotonic()
if now - last_decode < throttle:
continue
last_decode = now
result = self.decoder.decode(data)
self.sensor_data.update(result)
# Optional decode callback
if self.on_decode and result.has_data():
self.on_decode(result)

View File

@@ -112,20 +112,22 @@ class NMEAGenerator:
"""
sentences = []
now = datetime.utcnow()
time_str = now.strftime("%H%M%S.00")
date_str = now.strftime("%d%m%y")
# GPS sentences
if self.is_enabled('gga'):
s = self.generate_gga(data, now)
s = self.generate_gga(data, time_str)
if s:
sentences.append(s)
if self.is_enabled('gll'):
s = self.generate_gll(data, now)
s = self.generate_gll(data, time_str)
if s:
sentences.append(s)
if self.is_enabled('rmc'):
s = self.generate_rmc(data, now)
s = self.generate_rmc(data, time_str, date_str)
if s:
sentences.append(s)
@@ -211,7 +213,7 @@ class NMEAGenerator:
def generate_gga(
self,
data: SensorData,
time: Optional[datetime] = None
time_str: Optional[str] = None
) -> Optional[str]:
"""Generate GGA (GPS Fix Data) sentence."""
with data._lock:
@@ -224,14 +226,14 @@ class NMEAGenerator:
sentence = GGASentence(
latitude=lat,
longitude=lon,
time=time,
time=time_str,
)
return sentence.to_nmea()
def generate_gll(
self,
data: SensorData,
time: Optional[datetime] = None
time_str: Optional[str] = None
) -> Optional[str]:
"""Generate GLL (Geographic Position) sentence."""
with data._lock:
@@ -244,14 +246,15 @@ class NMEAGenerator:
sentence = GLLSentence(
latitude=lat,
longitude=lon,
time=time,
time=time_str,
)
return sentence.to_nmea()
def generate_rmc(
self,
data: SensorData,
time: Optional[datetime] = None
time_str: Optional[str] = None,
date_str: Optional[str] = None,
) -> Optional[str]:
"""Generate RMC (Recommended Minimum) sentence."""
with data._lock:
@@ -266,7 +269,8 @@ class NMEAGenerator:
sentence = RMCSentence(
latitude=lat,
longitude=lon,
time=time,
time=time_str,
date=date_str,
sog=sog,
cog=cog,
mag_var=self.mag_variation,

View File

@@ -62,8 +62,8 @@ class NMEASentence(ABC):
Two-character hex checksum
"""
checksum = 0
for char in sentence:
checksum ^= ord(char)
for byte in sentence.encode('ascii'):
checksum ^= byte
return f"{checksum:02X}"
def to_nmea(self) -> Optional[str]:

View File

@@ -6,7 +6,6 @@ GLL - Geographic Position
RMC - Recommended Minimum Navigation Information
"""
from datetime import datetime
from typing import Optional
from ..sentence import NMEASentence
@@ -45,7 +44,7 @@ class GGASentence(NMEASentence):
self,
latitude: Optional[float] = None,
longitude: Optional[float] = None,
time: Optional[datetime] = None,
time: Optional[str] = None,
quality: int = 1,
num_satellites: int = 8,
hdop: float = 1.0,
@@ -57,7 +56,7 @@ class GGASentence(NMEASentence):
Args:
latitude: Latitude in decimal degrees
longitude: Longitude in decimal degrees
time: UTC time, or None for current time
time: Pre-formatted NMEA time string (HHMMSS.SS), or None for current time
quality: GPS quality indicator (1=GPS, 2=DGPS)
num_satellites: Number of satellites in use
hdop: Horizontal dilution of precision
@@ -78,7 +77,7 @@ class GGASentence(NMEASentence):
if self.latitude is None or self.longitude is None:
return None
time_str = self.format_time(self.time)
time_str = self.time if self.time is not None else self.format_time()
lat_str = self.format_latitude(self.latitude)
lon_str = self.format_longitude(self.longitude)
@@ -133,7 +132,7 @@ class GLLSentence(NMEASentence):
self,
latitude: Optional[float] = None,
longitude: Optional[float] = None,
time: Optional[datetime] = None,
time: Optional[str] = None,
status: str = "A",
mode: str = "A",
):
@@ -142,7 +141,7 @@ class GLLSentence(NMEASentence):
Args:
latitude: Latitude in decimal degrees
longitude: Longitude in decimal degrees
time: UTC time, or None for current time
time: Pre-formatted NMEA time string (HHMMSS.SS), or None for current time
status: Status (A=valid, V=invalid)
mode: Mode indicator (A=autonomous, D=differential)
"""
@@ -159,7 +158,7 @@ class GLLSentence(NMEASentence):
lat_str = self.format_latitude(self.latitude)
lon_str = self.format_longitude(self.longitude)
time_str = self.format_time(self.time)
time_str = self.time if self.time is not None else self.format_time()
return f"{lat_str},{lon_str},{time_str},{self.status},{self.mode}"
@@ -195,7 +194,8 @@ class RMCSentence(NMEASentence):
self,
latitude: Optional[float] = None,
longitude: Optional[float] = None,
time: Optional[datetime] = None,
time: Optional[str] = None,
date: Optional[str] = None,
status: str = "A",
sog: Optional[float] = None,
cog: Optional[float] = None,
@@ -207,7 +207,8 @@ class RMCSentence(NMEASentence):
Args:
latitude: Latitude in decimal degrees
longitude: Longitude in decimal degrees
time: UTC time, or None for current time
time: Pre-formatted NMEA time string (HHMMSS.SS), or None for current time
date: Pre-formatted NMEA date string (DDMMYY), or None for current date
status: Status (A=valid, V=warning)
sog: Speed over ground in knots
cog: Course over ground in degrees true
@@ -217,6 +218,7 @@ class RMCSentence(NMEASentence):
self.latitude = latitude
self.longitude = longitude
self.time = time
self.date = date
self.status = status
self.sog = sog
self.cog = cog
@@ -228,9 +230,8 @@ class RMCSentence(NMEASentence):
if self.latitude is None or self.longitude is None:
return None
dt = self.time if self.time else datetime.utcnow()
time_str = self.format_time(dt)
date_str = self.format_date(dt)
time_str = self.time if self.time is not None else self.format_time()
date_str = self.date if self.date is not None else self.format_date()
lat_str = self.format_latitude(self.latitude)
lon_str = self.format_longitude(self.longitude)

View File

@@ -7,7 +7,6 @@ into structured Python objects.
from typing import Dict, List, Optional, Any
from dataclasses import dataclass, field as dc_field
import time
from .parser import ProtobufParser, ProtoField
from .constants import (
@@ -36,7 +35,7 @@ from .constants import (
)
@dataclass
@dataclass(slots=True)
class DecodedData:
"""Container for decoded sensor values from a single packet.
@@ -74,9 +73,6 @@ class DecodedData:
# Batteries: dict of battery_id -> voltage
batteries: Dict[int, float] = dc_field(default_factory=dict)
# Decode timestamp
timestamp: float = dc_field(default_factory=time.time)
def has_data(self) -> bool:
"""Check if any data was decoded."""
return (
@@ -109,6 +105,19 @@ class RaymarineDecoder:
print(f"GPS: {result.latitude}, {result.longitude}")
"""
_COLLECT_REPEATED = frozenset({14, 16, 20})
_PARSE_NESTED = frozenset({
Fields.GPS_POSITION, # 2
Fields.HEADING, # 3
Fields.SOG_COG, # 5
Fields.DEPTH, # 7
Fields.WIND_NAVIGATION, # 13
Fields.ENGINE_DATA, # 14
Fields.TEMPERATURE, # 15
Fields.TANK_DATA, # 16
Fields.HOUSE_BATTERY, # 20
})
def __init__(self, verbose: bool = False):
"""Initialize the decoder.
@@ -128,19 +137,16 @@ class RaymarineDecoder:
"""
result = DecodedData()
# Need at least header + some protobuf data
if len(packet) < HEADER_SIZE + 20:
return result
# Skip fixed header, protobuf starts at offset 0x14
proto_data = packet[HEADER_SIZE:]
# Parse protobuf - collect repeated fields:
# Field 14 = Engine data (contains battery voltage at 14.3.4)
# Field 16 = Tank data
# Field 20 = House battery data
parser = ProtobufParser(proto_data)
fields = parser.parse_message(collect_repeated={14, 16, 20})
fields = parser.parse_message(
collect_repeated=self._COLLECT_REPEATED,
parse_nested=self._PARSE_NESTED,
)
if not fields:
return result

View File

@@ -62,10 +62,11 @@ class ProtobufParser:
"""
self.data = data
self.pos = 0
self._len = len(data)
def remaining(self) -> int:
"""Return number of unread bytes."""
return len(self.data) - self.pos
return self._len - self.pos
def read_varint(self) -> int:
"""Decode a variable-length integer.
@@ -81,15 +82,19 @@ class ProtobufParser:
"""
result = 0
shift = 0
while self.pos < len(self.data):
byte = self.data[self.pos]
self.pos += 1
data = self.data
pos = self.pos
end = self._len
while pos < end:
byte = data[pos]
pos += 1
result |= (byte & 0x7F) << shift
if not (byte & 0x80):
break
shift += 7
if shift > 63:
break
self.pos = pos
return result
def read_fixed64(self) -> bytes:
@@ -127,13 +132,19 @@ class ProtobufParser:
def parse_message(
self,
collect_repeated: Optional[Set[int]] = None
collect_repeated: Optional[Set[int]] = None,
parse_nested: Optional[Set[int]] = None,
) -> Dict[int, Any]:
"""Parse all fields in a protobuf message.
Args:
collect_repeated: Set of field numbers to collect as lists.
Use this for repeated fields like tanks, batteries.
parse_nested: Set of field numbers whose length-delimited values
should be speculatively parsed as nested messages.
If None, all length-delimited fields are tried (legacy
behavior). Pass an explicit set to skip expensive
recursive parsing on fields you don't need.
Returns:
Dictionary mapping field numbers to ProtoField objects.
@@ -142,22 +153,17 @@ class ProtobufParser:
fields: Dict[int, Any] = {}
if collect_repeated is None:
collect_repeated = set()
filter_nested = parse_nested is not None
while self.pos < len(self.data):
if self.remaining() < 1:
break
while self.pos < self._len:
try:
# Read tag: (field_number << 3) | wire_type
tag = self.read_varint()
field_num = tag >> 3
wire_type = tag & 0x07
# Validate field number
if field_num == 0 or field_num > 536870911:
break
# Read value based on wire type
if wire_type == WIRE_VARINT:
value = self.read_varint()
elif wire_type == WIRE_FIXED64:
@@ -171,30 +177,26 @@ class ProtobufParser:
break
value = self.read_fixed32()
else:
# Unknown wire type (3, 4 are deprecated)
break
# For length-delimited, try to parse as nested message
children: Dict[int, ProtoField] = {}
if wire_type == WIRE_LENGTH and len(value) >= 2:
try:
nested_parser = ProtobufParser(value)
children = nested_parser.parse_message()
# Only keep if we parsed most of the data
if nested_parser.pos < len(value) * 0.5:
if not filter_nested or field_num in parse_nested:
try:
nested_parser = ProtobufParser(value)
children = nested_parser.parse_message()
if nested_parser.pos < len(value) * 0.5:
children = {}
except Exception:
children = {}
except Exception:
children = {}
pf = ProtoField(field_num, wire_type, value, children)
# Handle repeated fields - collect as list
if field_num in collect_repeated:
if field_num not in fields:
fields[field_num] = []
fields[field_num].append(pf)
else:
# Keep last occurrence for non-repeated fields
fields[field_num] = pf
except (IndexError, struct.error):

View File

@@ -125,7 +125,9 @@ class BatteryService(VeDbusServiceBase):
product_id = 0xA143 # Custom product ID for Raymarine Battery
# Maximum age in seconds before data is considered stale
MAX_DATA_AGE = 30.0
# Increased to 60s since Raymarine doesn't broadcast battery values frequently
# and battery voltage doesn't change rapidly
MAX_DATA_AGE = 60.0
def __init__(
self,

View File

@@ -115,6 +115,7 @@ class VeDbusServiceBase(ABC):
self._bus = None # Private D-Bus connection for this service
self._paths: Dict[str, Any] = {}
self._registered = False
self._ctx = None
@property
def service_name(self) -> str:
@@ -244,9 +245,18 @@ class VeDbusServiceBase(ABC):
return False
try:
self._update()
if self._dbusservice:
with self._dbusservice as s:
self._ctx = s
try:
self._update()
finally:
self._ctx = None
else:
self._update()
return True
except Exception as e:
self._ctx = None
logger.error(f"Error updating {self.service_name}: {e}")
return True # Keep trying
@@ -257,7 +267,11 @@ class VeDbusServiceBase(ABC):
path: The D-Bus path (e.g., '/Position/Latitude')
value: The value to set
"""
if self._dbusservice and self._registered:
s = self._ctx
if s is not None:
if path in s:
s[path] = value
elif self._dbusservice and self._registered:
with self._dbusservice as s:
if path in s:
s[path] = value
@@ -271,6 +285,11 @@ class VeDbusServiceBase(ABC):
Returns:
The current value, or None if not found
"""
s = self._ctx
if s is not None:
if path in s:
return s[path]
return None
if self._dbusservice and self._registered:
with self._dbusservice as s:
if path in s:

View File

@@ -93,7 +93,9 @@ class TankService(VeDbusServiceBase):
product_id = 0xA142 # Custom product ID for Raymarine Tank
# Maximum age in seconds before data is considered stale
MAX_DATA_AGE = 30.0
# Increased to 60s since Raymarine doesn't broadcast tank values frequently
# and tank levels don't change rapidly (< 1x/minute)
MAX_DATA_AGE = 60.0
def __init__(
self,

View File

@@ -75,7 +75,7 @@ from debug_logger import DebugLogger
logger = logging.getLogger('dbus-anchor-alarm')
_PEAK_LOAD_WINDOW_SEC = 60.0
_JSON_UPDATE_INTERVAL_SEC = 5.0
_JSON_UPDATE_INTERVAL_SEC = 20.0
_SLOW_TICK_DIVISOR = 15
_WATCHDOG_INTERVAL_SEC = 60.0
_WATCHDOG_WARN_MB = 25

View File

@@ -11,7 +11,7 @@ set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
VERSION="2.0.0"
VERSION="2.1.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-anchor-alarm"

View File

@@ -11,7 +11,7 @@ FIRMWARE_VERSION = 0
CONNECTED = 1
# Version
VERSION = '2.0.0'
VERSION = '2.1.0'
# Timing
MAIN_LOOP_INTERVAL_MS = 1000

View File

@@ -13,7 +13,7 @@ logger = logging.getLogger('dbus-anchor-alarm.track_buffer')
EARTH_RADIUS_FT = 20902231.0
DEDUP_THRESHOLD_FT = 7.0
MAX_POINTS = 5000
MAX_POINTS = 2000
TRIM_FRACTION = 0.20

View File

@@ -29,7 +29,7 @@ set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
# Default values
VERSION="1.0.0"
VERSION="2.1.0"
OUTPUT_DIR="$SCRIPT_DIR"
PACKAGE_NAME="dbus-generator-ramp"

View File

@@ -254,7 +254,7 @@ LOGGING_CONFIG = {
TIMING_CONFIG = {
# Main loop interval (milliseconds)
# This controls how often we check state and update
'main_loop_interval_ms': 500,
'main_loop_interval_ms': 1000,
# Timeout for D-Bus operations (seconds)
'dbus_timeout': 5,

View File

@@ -55,7 +55,7 @@ from ramp_controller import RampController
# Version
VERSION = '2.0.0'
VERSION = '2.1.0'
# D-Bus service name for our addon
SERVICE_NAME = 'com.victronenergy.generatorramp'