Data flow: XML-RPC, JSON-RPC, events, and data point updates

This document describes the end-to-end flow of data through aiohomematic for the two main transport mechanisms (XML-RPC and JSON-RPC), how incoming events are handled, and how data point values are updated and propagated within the system.

Audience: Contributors and integrators who need a precise understanding of message paths, responsibilities, and lifecycle of values and events.

Terminology: For definitions of Backend, Interface, Channel, Parameter, and Callback, see the Glossary.

Overview

  • Outbound reads/writes use the client layer (XML-RPC or JSON-RPC) to talk to the backend (CCU/Homegear).
  • Inbound events are pushed by the backend to aiohomematic’s local XML-RPC callback server run by Central.
  • Central updates the model (Device/Channel/DataPoint) and dynamic caches and notifies subscribers.

Key participants

  • CentralUnit (aiohomematic/central): orchestrates clients, runs XML-RPC callback server, stores caches, and hosts the runtime model.
  • Clients (aiohomematic/client): protocol adapters for XML-RPC (AioXmlRpcProxy) and JSON-RPC (AioJsonRpcAioHttpClient).
  • Model (aiohomematic/model): Device, Channel, DataPoints, Events; strictly no network I/O.
  • Store (aiohomematic/store): persistent descriptions, dynamic value/state caches, and diagnostic incident storage.

1. XML-RPC data flow

Purpose: Event callbacks from backend; many CCU operations can also be done via XML-RPC.

Outbound calls (read/write)

  1. A consumer reads a value: Central/Device/Channel delegates to Client.get_value(interface, address, paramset_key, parameter).
  2. The XML-RPC client invokes the corresponding backend method (e.g. getValue) on AioXmlRpcProxy towards the CCU/Homegear. Arguments are sanitized by _cleanup_args.
  3. The result is decoded and (if needed) converted in the model/support layer and returned to the caller. Central may store the value in the dynamic cache for the DataPointKey.
  4. A consumer writes a value: DataPoint.set_value(...) delegates to Device/Channel/Client. Client uses AioXmlRpcProxy to invoke setValue (or paramset writes), and may record a pending command in CommandTracker.
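The outbound read path above can be sketched with stub classes. This is a minimal illustration only: `FakeProxy`, the simplified `Client`, and the plain-dict cache are assumptions for the example, not the library's actual API; the real flow goes through InterfaceClient and AioXmlRpcProxy.

```python
import asyncio


class FakeProxy:
    """Stand-in for AioXmlRpcProxy: answers getValue calls from a preset dict."""

    def __init__(self, backend_values):
        self._values = backend_values

    async def getValue(self, channel_address, parameter):
        return self._values[(channel_address, parameter)]


class Client:
    """Simplified client: delegates reads to the proxy, then updates the cache."""

    def __init__(self, proxy, cache):
        self._proxy = proxy
        self._cache = cache  # maps a DataPointKey-like tuple to the last value

    async def get_value(self, interface, address, paramset_key, parameter):
        value = await self._proxy.getValue(address, parameter)
        # Mirror the documented behavior: the result is stored for the DataPointKey.
        self._cache[(interface, address, paramset_key, parameter)] = value
        return value


async def main():
    proxy = FakeProxy({("VCU0000001:1", "LEVEL"): 0.5})
    cache = {}
    client = Client(proxy, cache)
    value = await client.get_value("BidCos-RF", "VCU0000001:1", "VALUES", "LEVEL")
    return value, cache


value, cache = asyncio.run(main())
print(value)  # 0.5
```

A subsequent consumer read can then be served from `cache` without touching the proxy, which is the pull-side caching behavior described in section 4.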

Inbound events (push from backend)

  1. Backend calls the local callback server started by Central (xml_rpc_server.AsyncXmlRpcServer), method AsyncRPCFunctions.event(interface_id, channel_address, parameter, value).
  2. AsyncRPCFunctions looks up the Central for interface_id and forwards the event via decorators to Central's event_coordinator.data_point_event(...).
  3. Central resolves the target DataPoint from (channel_address, parameter), converts value if needed, updates dynamic caches and the DataPoint's internal state.
  4. Central publishes events via the EventBus system (DataPointValueReceivedEvent, DeviceStateChangedEvent, etc.). Subscribers receive notifications through EventBus.subscribe(). Connection health metadata (PingPongTracker, last-seen timestamps) is updated. PingPongTracker records incidents to IncidentStore when mismatch thresholds are exceeded. Pending CommandTracker entries may be reconciled if the event confirms a write.
  5. If the event indicates structural changes (newDevices, deleteDevices, updateDevice, replaceDevice, readdedDevice), the respective AsyncRPCFunctions handlers forward to Central which triggers model updates (reload descriptions, add/remove devices/channels).
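The inbound dispatch in steps 1-2 can be sketched as a registry keyed by interface_id. The class names and method shapes here are hypothetical simplifications of AsyncRPCFunctions and Central, kept only to show the routing idea:

```python
class EventRouter:
    """Sketch of AsyncRPCFunctions-style routing: interface_id -> owning central."""

    def __init__(self):
        self._centrals = {}

    def register(self, interface_id, central):
        self._centrals[interface_id] = central

    def event(self, interface_id, channel_address, parameter, value):
        # Look up the Central responsible for this interface and forward the event.
        central = self._centrals.get(interface_id)
        if central is None:
            raise KeyError(f"unknown interface_id: {interface_id}")
        central.data_point_event(channel_address, parameter, value)


class FakeCentral:
    """Records forwarded events instead of updating a real model."""

    def __init__(self):
        self.received = []

    def data_point_event(self, channel_address, parameter, value):
        self.received.append((channel_address, parameter, value))


router = EventRouter()
central = FakeCentral()
router.register("ccu-BidCos-RF", central)
router.event("ccu-BidCos-RF", "VCU0000001:1", "STATE", True)
print(central.received)
```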

2. JSON-RPC data flow

Purpose: Alternative/optional transport for CCU JSON API and ReGa interactions (programs, system variables, metadata, values).

Outbound calls (read/write)

  1. Consumer requests go through InterfaceClient (with CcuBackend or JsonCcuBackend), which uses AioJsonRpcAioHttpClient.
  2. AioJsonRpcAioHttpClient ensures an authenticated session (login or renew). Requests are posted via _post/_do_post with method names defined in _JsonRpcMethod.
  3. Responses are parsed safely (_get_json_reponse) and converted to domain structures:
     • Device details → Names, Rooms, Functions
     • Device descriptions → DeviceDescription (via JsonCcuBackend)
     • Paramset descriptions → ParameterData (via JsonCcuBackend)
     • Program/system variable data → ProgramData/SystemVariableData
     • Values (get/set) are converted via model.support.convert_value where needed.
  4. Dynamic caches in Central are updated. For writes, CommandTracker may record the pending state until a confirming callback (if provided via XML-RPC) or a subsequent read reconciles the value.
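The request/response handling in steps 2-3 can be illustrated with plain JSON. The field names below follow common CCU JSON-RPC conventions and are an assumption for this sketch, not the exact payloads built by AioJsonRpcAioHttpClient:

```python
import json


def build_request(method, params, call_id=1):
    """Build a JSON-RPC-style request body (field names assumed for illustration)."""
    return json.dumps(
        {"method": method, "params": params, "jsonrpc": "1.1", "id": call_id}
    )


def parse_response(raw):
    """Parse a response defensively: surface 'error', otherwise return 'result'."""
    data = json.loads(raw)
    if data.get("error"):
        raise RuntimeError(f"JSON-RPC error: {data['error']}")
    return data.get("result")


req = build_request("Session.login", {"username": "Admin", "password": "secret"})
resp = parse_response('{"result": "SESSION123", "error": null, "id": 1}')
print(resp)
```

The defensive `parse_response` mirrors the "parsed safely" step: a response with a populated `error` field is raised instead of being silently converted to a domain structure.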

Inbound events

  • JSON-RPC interface typically does not deliver push events. Event callbacks still arrive via the XML-RPC callback server. Therefore, even when JSON-RPC is used for reads/writes/metadata, XML-RPC event handling remains the main push channel.

3. Event handling inside Central

Trigger points

  • AsyncRPCFunctions.event: raw value event.
  • AsyncRPCFunctions.newDevices/deleteDevices/updateDevice/replaceDevice/readdedDevice: topology/config events.
  • Error callback AsyncRPCFunctions.error: error conditions from the backend, forwarded as BackendSystemEvent for diagnostics.

Processing steps for a value event

  1. Identify DataPoint by channel_address and parameter.
  2. Convert incoming raw value to the DataPoint's typed value when necessary (e.g., boolean normalization, levels).
  3. Update in-memory value cache for DataPointKey and set modified/last-updated timestamps.
  4. Update DataPoint internal state and publish events to subscribers via EventBus:
     • DataPointValueReceivedEvent: notifies about value changes
     • DeviceStateChangedEvent: notifies about device state changes
     • FirmwareStateChangedEvent: notifies about firmware state changes
  5. Subscribers receive notifications through EventBus.subscribe() (e.g., subscribing to DataPointStateChangedEvent, DeviceStateChangedEvent).
  6. Reconcile pending commands (if the new value matches a recent write) and adjust connection health markers.
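The publish/subscribe step can be sketched as a minimal event bus keyed by event type. This is an illustrative reduction, not the library's EventBus implementation; the event class here is a bare stand-in:

```python
from collections import defaultdict


class EventBus:
    """Minimal publish/subscribe sketch: callbacks registered per event type."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, callback):
        self._subscribers[event_type].append(callback)

    def publish(self, event):
        # Dispatch to every subscriber registered for this event's type.
        for callback in self._subscribers[type(event)]:
            callback(event)


class DataPointValueReceivedEvent:
    """Stand-in event carrying a data point key and its new value."""

    def __init__(self, key, value):
        self.key, self.value = key, value


bus = EventBus()
seen = []
bus.subscribe(DataPointValueReceivedEvent, lambda e: seen.append((e.key, e.value)))
bus.publish(DataPointValueReceivedEvent("VCU0000001:1:LEVEL", 0.5))
print(seen)
```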

Error and edge cases

  • Unknown channel/parameter: Central can log a warning and optionally trigger a metadata refresh to discover new parameters.
  • Out-of-range/invalid payloads: values are validated and normalized by model.support and validators; unexpected fields are ignored with warnings.
  • Lost connection: Clients raise NoConnectionException; Central may attempt reconnects. Writes can be queued or fail fast based on configuration.
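The lost-connection behavior (queue vs. fail fast) can be sketched with a small dispatcher. The class and flag names are hypothetical; only the two documented behaviors and the NoConnectionException name come from the text above:

```python
class NoConnectionException(Exception):
    """Raised when a write is attempted without a backend connection."""


class WriteDispatcher:
    """Sketch: queue writes while disconnected, or fail fast, per configuration."""

    def __init__(self, queue_when_disconnected):
        self.connected = False
        self.queue_when_disconnected = queue_when_disconnected
        self.pending = []  # writes held back while disconnected
        self.sent = []     # writes that reached the backend

    def set_value(self, key, value):
        if not self.connected:
            if self.queue_when_disconnected:
                self.pending.append((key, value))
                return
            raise NoConnectionException(key)
        self.sent.append((key, value))

    def reconnect(self):
        # Flush queued writes in order once the connection is back.
        self.connected = True
        while self.pending:
            self.sent.append(self.pending.pop(0))


dispatcher = WriteDispatcher(queue_when_disconnected=True)
dispatcher.set_value("LEVEL", 0.7)  # queued, not sent
dispatcher.reconnect()              # flushed on reconnect
print(dispatcher.sent)
```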

4. Data point update lifecycle (pull and push)

Pull (read)

  • A read through Client.get_value updates the dynamic cache and the DataPoint state on return. Consumers can read from the DataPoint without immediate I/O if cache is fresh.

Push (event)

  • An event updates the cache and DataPoint first; a subsequent read will return the updated value without backend I/O.
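The pull and push behaviors above can be combined in one sketch: reads hit the backend only when the cached value is stale, while an event refreshes the cache so later reads need no I/O. The freshness window and class shape are assumptions for illustration:

```python
import time


class CachedDataPoint:
    """Sketch: value cache with a freshness window; reads hit the backend only when stale."""

    def __init__(self, read_from_backend, max_age_s=30.0):
        self._read = read_from_backend
        self._max_age = max_age_s
        self._value = None
        self._updated_at = None

    def on_event(self, value):
        # Push path: an incoming event updates cache and timestamp first.
        self._value = value
        self._updated_at = time.monotonic()

    def value(self):
        # Pull path: backend I/O only when the cache is missing or stale.
        if self._updated_at is None or time.monotonic() - self._updated_at > self._max_age:
            self.on_event(self._read())
        return self._value


reads = []


def backend_read():
    reads.append(1)
    return 0.25


dp = CachedDataPoint(backend_read)
first = dp.value()   # cache empty: triggers one backend read
dp.on_event(0.75)    # push event refreshes the cache
second = dp.value()  # served from cache, no backend I/O
print(first, second, len(reads))
```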

Write

  • DataPoint.send_value delegates to Client.set_value or Client.put_paramset. Values flow through a three-tier resolution model: Tier 1 (optimistic, set before RPC call for instant UI feedback), Tier 2 (unconfirmed, set after RPC for polling data points), and Tier 3 (confirmed, set on CCU event or polling read). The RPC call is wrapped by CommandRetryHandler.execute_with_retry(), which retries transient failures (timeouts, device unreachability, DutyCycle exhaustion) with exponential backoff. Non-idempotent data points (DpAction, DpButton) have _retryable=False and are not retried by default. See value_resolution.md for the full value resolution mechanism.
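The three-tier lifecycle and the retry policy can be sketched together. This is a simplified shape, not the real CommandRetryHandler or DataPoint API: the backoff sleep is elided to a comment, and the tier names merely mirror the description above.

```python
import enum


class Tier(enum.IntEnum):
    OPTIMISTIC = 1   # set before the RPC call (instant UI feedback)
    UNCONFIRMED = 2  # set after the RPC call returned
    CONFIRMED = 3    # set on CCU event or polling read


class WritableDataPoint:
    """Sketch of the three-tier write lifecycle with a simple retry loop."""

    def __init__(self, rpc_call, retryable=True, max_attempts=3):
        self._rpc_call = rpc_call
        self._retryable = retryable  # non-idempotent points would set this False
        self._max_attempts = max_attempts
        self.value = None
        self.tier = None

    def send_value(self, value):
        self.value, self.tier = value, Tier.OPTIMISTIC  # Tier 1: before the RPC
        attempts = self._max_attempts if self._retryable else 1
        for attempt in range(1, attempts + 1):
            try:
                self._rpc_call(value)
                break
            except TimeoutError:
                if attempt == attempts:
                    raise
                # real code would sleep with exponential backoff before retrying
        self.tier = Tier.UNCONFIRMED  # Tier 2: RPC returned, not yet confirmed

    def on_event(self, value):
        self.value, self.tier = value, Tier.CONFIRMED  # Tier 3: backend confirmed


calls = []


def flaky_rpc(value):
    calls.append(value)
    if len(calls) == 1:
        raise TimeoutError  # first attempt times out; retry succeeds


dp = WritableDataPoint(flaky_rpc)
dp.send_value(0.5)  # retried once, ends at Tier 2
dp.on_event(0.5)    # confirming event promotes to Tier 3
print(dp.tier, len(calls))
```

With `retryable=False` (as for action/button-like points) the loop makes a single attempt and the TimeoutError propagates, matching the "not retried by default" rule.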

5. Sequence diagrams (Mermaid)

Additional sequence diagrams for connect, device discovery, and state change propagation are available in architecture/sequence_diagrams.md.

Read value via XML-RPC

```mermaid
sequenceDiagram
  actor App
  participant DP as DataPoint
  participant Dev as Device/Channel
  participant C as CentralUnit
  participant CX as InterfaceClient (XML-RPC)
  participant X as AioXmlRpcProxy
  App->>DP: read()
  DP->>Dev: delegate
  Dev->>C: get_value(address, key, param)
  C->>CX: get_value(...)
  CX->>X: getValue(...)
  X-->>CX: value
  CX-->>C: value
  C->>C: update dynamic cache
  C-->>Dev: value
  Dev-->>DP: value
  DP-->>App: value
```

Incoming event via XML-RPC

```mermaid
sequenceDiagram
  participant B as Backend (CCU/Homegear)
  participant X as AsyncXmlRpcServer
  participant C as CentralUnit
  participant M as Model (DataPoint)
  actor App
  B-->>X: event(interface_id, channel_address, parameter, value)
  X->>C: data_point_event(...)
  C->>C: lookup DataPoint, convert value
  C->>M: update state
  C->>C: update caches, health, reconcile commands
  C-->>App: notify subscribers
```

Write value via XML-RPC

```mermaid
sequenceDiagram
  actor App
  participant DP as DataPoint
  participant Dev as Device/Channel
  participant C as CentralUnit
  participant CX as InterfaceClient (XML-RPC)
  participant X as AioXmlRpcProxy
  App->>DP: set_value(v)
  DP->>Dev: delegate
  Dev->>C: set_value(...)
  C->>CX: set_value(...)
  CX->>X: setValue(...)
  X-->>CX: ok
  CX-->>C: ok
  C->>C: update CommandTracker (pending)
  note over C: Event expected to confirm
```

6. Where to look in code

  • XML-RPC server: aiohomematic/central/rpc_server.py
     • AsyncRPCFunctions.event/newDevices/... and AsyncXmlRpcServer lifecycle
  • XML-RPC client: aiohomematic/client/rpc_proxy.py
     • AioXmlRpcProxy, supported methods, async request handling
  • JSON-RPC client: aiohomematic/client/json_rpc.py
     • AioJsonRpcAioHttpClient, login/session, methods, conversions
  • Client: aiohomematic/client/interface_client.py and aiohomematic/client/backends/
     • InterfaceClient (unified client), CcuBackend, JsonCcuBackend, HomegearBackend
  • Model and updates: aiohomematic/model/device.py and subpackages
     • Device/Channel, DataPoint types, value cache, update flow
  • Events: aiohomematic/model/event.py

Notes

  • Tests under tests/ verify many of these flows; see tests/test_central.py, tests/test_client_json_rpc_client.py, tests/test_central_event_bus.py, tests/test_central_event_coordinator.py, and device/data point related tests.