Event Data Platform

Digital Twin enabled Digital Ecosystems
Published by Iotics

Features

  • Event-Data Platform
  • Data Integration
  • Digital Data Twins
  • Selective, Secure Data Sharing
  • Event Analytics
  • Automated scalability

Event-Data Platform

As companies seek to become 'data-centric', they need to embrace an event-driven architecture. This requires harvesting events across the IT landscape: from legacy systems, from near real-time and static data sources within and outside company boundaries, and from microservices where deployed. An event-data platform enables this by providing an architectural overlay that creates intelligent data and streams of events, combined with selective sharing of event-data.
Architectural Overlay via integrators and digital data twins
  • The platform is fed by integrators that can access data from any source, near real-time or static, convert it into events and pass those events securely to a digital twin in the platform. It is via the twin that events are pushed to the twins of applications.
Intelligent Data from Digital Data Twins
  • A Digital Data Twin is a virtual representation of an asset, process, system or person. It includes all metadata, data, controls, state changes and relationships collected by the real-world asset during its life, in near real-time. Internal, external, private and public data sources feed data and events to the digital twin to create a richer, more informed, unified version of the valued asset. The information gathered in near real-time from various sources contributes to the virtual entity: the Digital Data Twin. A twin acts as either a source of events or a follower of events, depending on whether data is being shared or control is exercised; a single twin can perform either function. Events from several twins can be aggregated into a synthesiser twin, as sketched below. This "synthesis" enriches and contextualises the data from the other twins and allows the ecosystem to grow organically, flexibly and securely without the need to re-architect.
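A minimal Python sketch of the synthesiser idea follows. It is illustrative only and does not use the Iotics API: the Event and SynthesiserTwin classes, and the callback-style followers, are assumptions made for the example.

    # Illustrative sketch (not the Iotics API): a synthesiser twin that follows
    # several source twins, combines their events and republishes an enriched,
    # contextualised event of its own.
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        source: str
        kind: str
        payload: dict

    @dataclass
    class SynthesiserTwin:
        name: str
        latest: dict = field(default_factory=dict)      # last event per source twin
        followers: list = field(default_factory=list)

        def on_event(self, event: Event) -> None:
            """Called when any followed twin pushes an event."""
            self.latest[event.source] = event.payload
            enriched = Event(source=self.name, kind="synthesis",
                             payload=dict(self.latest))  # combined context
            for follower in self.followers:
                follower(enriched)                       # push, never pull

    # Usage: a dashboard twin follows the synthesiser
    synth = SynthesiserTwin("fleet-overview")
    synth.followers.append(lambda e: print("dashboard received:", e.payload))
    synth.on_event(Event("engine-42", "telemetry", {"rpm": 3100}))
    synth.on_event(Event("weather-07", "observation", {"wind_kts": 18}))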
Intelligent Data and Selective Sharing
  • Semantic descriptions are given to event data via the integrators. It is this metadata that determines how events are managed by the platform and then served to the applications that need the information. The semantics enable twins to recognise the data sources and followers to which they need to bind for secure interactions. This is what enables the automation within the platform and avoids the need to re-architect it.
Event Analytics via Event Stream and Event Log
  • The Event Stream is the historical record of meaningful events in a twin’s life. The events are stored in an immutable database and are available for any allowed follower to replay from a point in time – the Event Sourcing technique. Events and data from the twins are a useful resource for any application to analyse.
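The Event Sourcing technique described above can be illustrated with a short, hedged Python sketch. The append/replay functions and in-memory log are assumptions for explanation; the platform's immutable database is stood in for by a simple append-only list.

    # Illustrative sketch of Event Sourcing: events are appended to an immutable
    # log and an allowed follower can replay them from a chosen point in time.
    from datetime import datetime, timezone

    event_log = []   # append-only; entries are never updated or deleted

    def append(kind: str, payload: dict) -> None:
        event_log.append({
            "at": datetime.now(timezone.utc),
            "kind": kind,
            "payload": payload,
        })

    def replay(since: datetime):
        """Yield every event recorded at or after 'since', in order."""
        for event in event_log:
            if event["at"] >= since:
                yield event

    append("built", {"serial": "ENG-001"})
    append("sold", {"owner": "Acme Shipping"})
    for event in replay(datetime(2020, 1, 1, tzinfo=timezone.utc)):
        print(event["kind"], event["payload"])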
Synthesis via Follower Twins
  • A follower’s cache of a Digital Data Twin’s data updates when new events flow from the twin. The follower analyses the events and data, contextualises them with other sources and provides feeds for other twins in the ecosystem. This “synthesis” enriches and contextualises the data from the other twins and allows the ecosystem to grow organically, flexibly and securely.

Data Integration

It is relatively straightforward to integrate a data source from any endpoint. Integrations can be leveraged from the existing integration library or customised privately. The data is contextualised via semantic descriptions to map to any system, industry or global standard, allowing scalability, development and growth without standard restrictions.
Data Integration
A patented and completely secure model
  • Integrate private, public or 3rd-party sources into the platform using lightweight code via existing APIs. The integrators feed events securely to a digital data twin managed by the platform.
  • The integrator does not expose the full data sets available, only the events relevant to the use case.
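A hedged sketch of such an integrator is shown below. The field names, semantic term and publish_to_twin function are hypothetical; the point is that only the event-relevant fields, with their semantic description, are passed on to the twin.

    # Illustrative integrator sketch (hypothetical field names; the real source
    # would be read via its existing API): keep only the fields relevant to the
    # use case, attach a semantic description and push the result to the twin.
    SEMANTIC_TYPE = "https://example.org/ontology#Temperature"   # assumed ontology term

    def poll_source() -> dict:
        # Stands in for a call to the source system's existing API.
        return {"temperature_c": 21.4, "timestamp": "2021-06-01T12:00:00Z",
                "site": "plant-3", "operator_notes": "internal only"}

    def to_event(raw: dict) -> dict:
        # Only the event-relevant fields are exposed, never the full data set.
        return {
            "semantics": SEMANTIC_TYPE,
            "value": raw["temperature_c"],
            "observed_at": raw["timestamp"],
        }

    def publish_to_twin(event: dict) -> None:
        # Stands in for the secure call that feeds the digital data twin.
        print("pushing to digital data twin:", event)

    publish_to_twin(to_event(poll_source()))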

Digital Data Twins

A Digital Data Twin is a virtual representation of an asset, document, process, system or person. It includes all metadata, data, controls, state changes and relationships collected by the real-world asset during its life, in near real-time. Internal, external, private and public data sources feed data and events to the digital twin to create a richer, more informed, unified version of the valued asset. The information gathered in near real-time from various sources contributes to the virtual entity: the Digital Data Twin. The twin presents four capabilities:
  • Metadata
  • A control interface
  • Fast-moving near real-time data
  • Historical events and state changes
Digital Data Twins
Metadata
  • Static or very slow-moving data about the twin is stored in a semantic format: for example, serial number, classification and relations to other twins.
Control Interface
  • Enables other twins to send instructions and commands (assuming permission has been given) to a twin; these are passed on to the asset/thing that the twin abstracts.
Fast-moving near real-time data
  • Flows through the twin to the consumer. Near real-time data should not be stored by the twin; it is up to the consumer to do that.
Historical events and state changes
  • A log of significant events over the life of the asset, such as being built, sold or moved, alerts about conditions and relevant changes. Stored in an immutable database. The event log can be replayed to the consumer on request.
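The four capabilities above can be pictured with a short Python sketch. It is a simplification for explanation, not the Iotics data model; the DigitalDataTwin class and its method names are assumptions.

    # Illustrative sketch of the four twin capabilities grouped into one structure.
    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class DigitalDataTwin:
        metadata: dict                                        # static / slow-moving semantic data
        event_log: List[dict] = field(default_factory=list)   # immutable history, replayable
        followers: List[Callable] = field(default_factory=list)

        def control(self, command: dict) -> None:
            """Control interface: pass a permitted command on to the abstracted asset."""
            print("forwarding command to asset:", command)

        def push(self, sample: dict) -> None:
            """Fast-moving data flows through to consumers; the twin does not store it."""
            for follower in self.followers:
                follower(sample)

        def record(self, event: dict) -> None:
            """Historical events and state changes are appended, never rewritten."""
            self.event_log.append(event)

    engine = DigitalDataTwin(metadata={"serial": "ENG-001", "class": "turbine"})
    engine.followers.append(lambda s: print("consumer received:", s))
    engine.push({"rpm": 3100, "temp_c": 412})
    engine.record({"kind": "maintenance", "detail": "oil change"})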
Digital Data Twins of Assets
  • Physical assets in the real world, such as engines, have near real-time data about their current functioning, such as: temperatures, pressures and RPM. Engines also have event-based data such as maintenance schedules, and static metadata like their serial number, version and classification. The twin and its event stream present this data and metadata in a normalised way to potential consumers.
Digital Data Twins of Environments
  • Environments have metadata about their place in the world (geolocation, place names, opening times), near real-time data about footfall, internal climate, power consumption and event-based data such as change of ownership, hosted events and calendar entries.
Digital Data Twins of Individuals
  • People have metadata about themselves (names, ID and relationships to other people), fast-moving data such as pulse rate and blood pressure, and events such as marriage, movement, joining or leaving a company and credit histories. Digital twins can request to receive data streams (either events or near real-time) from other twins. The twin holding the data can accept or deny this request. This is the model of “brokered interaction”. Interactions can be formed between twins in separate environments belonging to different enterprises, breaking down the siloed approach to storing data within and beyond corporate boundaries and creating the digital ecosystem.

Selective, Secure Data Sharing

Brokered, fine grained interactions between digital data twins allow selective and permission-based sharing of data. These relationships enable the construction of digital ecosystems that securely flow data across corporate boundaries in near real-time between digital twins. The interactions can be transient or permanent to mirror the requirements of digital ecosystem participants. Brokered, secure data sharing across customer, supplier and partner boundaries is critical for business agility, scalability and new revenue realisations.
Selective, Secure Data Sharing
Abstraction
  • Each twin is an abstraction of something (physical or otherwise) in the real world. Attack surfaces are on the virtual twins only, protecting the real thing from malicious activity such as Denial of Service attacks.
Access Control and Event-Driven Model
  • Access Control Lists (ACLs) govern who is allowed to receive data from, or send control instructions to, a twin. All potential consumers of a twin’s data must first request to “follow” the publishing twin’s feed. The request is not direct: it is sent to Iotics’ environment, where the ACL is evaluated and, if acceptable, a subscription is granted to the requestor. Iotics’ environment was designed from the beginning to be event-driven, with all APIs being “push” rather than “pull”. Any twin that wants data must first request to follow a twin (which can be denied by the ACL) and then wait to be sent events when the followed twin’s predefined rules are met. There is no “GET of death” DoS attack possible, because there are no GETs.
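A minimal Python sketch of this brokered, push-only flow is given below. The acl dictionary and the request_follow and publish functions are illustrative assumptions, not the Iotics API.

    # Illustrative sketch of the brokered, push-only model: a follow request is
    # evaluated against the publishing twin's ACL and, if accepted, the follower
    # is only ever *sent* events; there is no GET to attack.
    from typing import Callable, Dict, List, Set

    acl: Dict[str, Set[str]] = {
        "engine-42": {"maintenance-app", "fleet-dashboard"},   # allowed followers
    }
    subscriptions: Dict[str, List[Callable]] = {}

    def request_follow(publisher: str, follower: str, callback: Callable) -> bool:
        """Broker the request: evaluate the ACL, grant or deny a subscription."""
        if follower not in acl.get(publisher, set()):
            return False                     # denied; no data is ever exposed
        subscriptions.setdefault(publisher, []).append(callback)
        return True

    def publish(publisher: str, event: dict) -> None:
        """Push events to granted subscribers only."""
        for callback in subscriptions.get(publisher, []):
            callback(event)

    request_follow("engine-42", "fleet-dashboard", lambda e: print("dashboard:", e))
    request_follow("engine-42", "unknown-party", lambda e: print("never called"))
    publish("engine-42", {"alert": "temperature high"})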
Cryptography
  • Two kinds of cryptography are employed in Iotics’ environment: encryption and signing.
  • Encryption follows industry-standard approaches: TLS 1.2 is used for all messages (via AMQPS) and for host-to-host communication.
  • Signing is applied to all messages from an endpoint into the system. To prevent spoofing and replay attacks, all messages carry sequence numbers and are doubly hashed using the Hash-based Message Authentication Code (HMAC) approach. HMAC means that messages are tamper-resistant and can be reliably traced back to the sending endpoint.
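The signing scheme can be illustrated with Python's standard hmac module. The shared key, message format and replay check below are assumptions for the sketch; the real system's key management and wire format will differ.

    # Illustrative HMAC signing with sequence numbers: a tampered or replayed
    # message fails verification.
    import hashlib
    import hmac
    import json

    SHARED_KEY = b"endpoint-secret"          # assumed per-endpoint key
    last_seen_seq = 0

    def sign(seq: int, payload: dict) -> dict:
        body = json.dumps({"seq": seq, "payload": payload}, sort_keys=True).encode()
        tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        return {"seq": seq, "payload": payload, "tag": tag}

    def verify(message: dict) -> bool:
        global last_seen_seq
        body = json.dumps({"seq": message["seq"], "payload": message["payload"]},
                          sort_keys=True).encode()
        expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, message["tag"]):
            return False                     # tampered message
        if message["seq"] <= last_seen_seq:
            return False                     # replayed message
        last_seen_seq = message["seq"]
        return True

    msg = sign(1, {"rpm": 3100})
    print(verify(msg))    # True
    print(verify(msg))    # False: same sequence number, treated as a replay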
Visibility
  • In order to bind to a twin in the find-and-bind model, the twin must be found. A twin’s owner can set the visibility of a twin so that it will not return in any searches. If it cannot be found, it cannot be attacked.

Event Analytics

Near real-time event analytics stream across brokered interactions between digital twins. The out-of-the-box capability streams only meaningful event data without polling or storing huge amounts of data. Digital twins can be configured to recognise significant events and respond in near real-time. Event analytics streaming eliminates the need for monolithic data lakes, optimises networks without data flooding and offers machine learning and AI opportunities.
Event Analytics
Data Driven Analytics
  • Event Analytics within the Event-Data Platform is the processing of near real-time, event-based data to create additional meaningful events. The new events are fed back to digital data twins in the operating environment, where they are stored in event logs and shared with followers and applications. Applications can display the information in dashboards (visual analytics) or use the data to formulate predictions based on a model (predictive analytics). All of these forms of analytics fall under the blanket term “event analytics”. The Iotics technology streams event analytics so that other systems can make use of them.
Basic Event Analytics
  • A basic example of event analytics is checking whether a value lies within predefined limits (e.g. a river level). If the value is outside the predefined range, an event is created and sent to any twins that are following it (e.g. a boat). Followers receive this new event and react in near real-time.
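A minimal sketch of this river-level check, in Python, is shown below. The thresholds and follower callbacks are illustrative assumptions.

    # Minimal threshold check: a value is compared against predefined limits and
    # an event is pushed to followers only when it is out of range.
    LOW, HIGH = 0.5, 3.0      # metres, assumed safe range for the river level

    followers = [lambda e: print("boat twin received:", e)]

    def on_reading(level_m: float) -> None:
        if LOW <= level_m <= HIGH:
            return                            # nothing meaningful: no event is sent
        event = {"kind": "river_level_out_of_range", "level_m": level_m}
        for follower in followers:
            follower(event)

    on_reading(1.8)    # within limits: no event is created
    on_reading(3.4)    # out of range: followers react in near real-time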
Mature Event Analytics
  • A more complex example is a train that has deviated from its planned schedule. This has knock-on effects for a number of interested parties: not only the passengers, but also the maintenance crews waiting for the train at the end of the day, and future schedules for that train. The analytics in this case are more intricate, needing the position of the train, its route, its priority and so on. The result of the processing is that multiple significant events are created and sent: one to the train twin (as soon as it is known that it has deviated) and one each to the planned and actual depots, logistics partners and scheduling teams.

Automated scalability

Based on semantic metadata, new endpoints can be automatically found, created as digital twins in the operating environment and configured into the digital ecosystem. Programmatic, automated find-and-bind powers an agile, flexible, extensible and scalable architecture for machine readability and AI opportunities.
Automated scalability
'Find and Bind'
  • Automated scalability through 'find and bind' avoids the need to re-architect a solution.
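A hedged Python sketch of 'find and bind' follows. The registry, ontology term and find_and_bind function are hypothetical; the point is that followers are bound by semantic metadata rather than by hand-written integrations per endpoint.

    # Illustrative 'find and bind': new endpoints described with semantic metadata
    # can be found by a search and bound as followers automatically.
    TEMPERATURE = "https://example.org/ontology#Temperature"   # assumed ontology term

    registry = [
        {"twin": "sensor-001", "semantics": {TEMPERATURE}},
        {"twin": "sensor-002", "semantics": {TEMPERATURE}},
        {"twin": "door-007",  "semantics": {"https://example.org/ontology#DoorState"}},
    ]
    bindings = []

    def find_and_bind(required: set, follower: str) -> None:
        for entry in registry:
            if required <= entry["semantics"]:
                bindings.append((entry["twin"], follower))   # still subject to the publisher's ACL

    find_and_bind({TEMPERATURE}, "hvac-analytics")
    print(bindings)   # both temperature sensors bound, with no code changes per endpoint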