Data Flow
Describing the flow of data within the pipeline.

This diagram aims to highlight the flow of data within the pipeline. For now, it is assumed that all data will be stored in a single data storage instance (Pipeline Data). The practical implementation is likely to expand on this: media will most likely be stored in an Azure File Share or Blob Storage and merely referenced in the database.
Users can provide two kinds of data: media (images/video) or parameters. Parameters are based on user input and are stored as parameter sets. Media is first analysed using OpenFace, after which both the media item and the OpenFace data are stored. This data is later processed by the custom analysis module, which compares the OpenFace data to a specified parameter set. The results of this analysis are then compiled into a report, which is also stored.
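The flow above can be sketched as follows. This is an illustrative sketch only: the names (`PipelineStore`, `run_openface`, `ingest_media`, `analyse`) and the dict-based storage are assumptions standing in for the actual Pipeline Data instance and module interfaces, and the OpenFace call is stubbed out.

```python
import json
from dataclasses import dataclass, field


@dataclass
class PipelineStore:
    """Stands in for the single 'Pipeline Data' storage instance."""
    media: dict = field(default_factory=dict)
    openface_data: dict = field(default_factory=dict)
    parameter_sets: dict = field(default_factory=dict)
    reports: dict = field(default_factory=dict)


def run_openface(media_item: bytes) -> dict:
    # Placeholder for the real OpenFace analysis of an image/video.
    return {"au_intensities": {"AU01": 0.4, "AU12": 1.2}}


def ingest_media(store: PipelineStore, media_id: str, media_item: bytes) -> None:
    # Media is analysed with OpenFace first; both the media item and
    # the OpenFace output are then stored.
    store.media[media_id] = media_item
    store.openface_data[media_id] = run_openface(media_item)


def analyse(store: PipelineStore, media_id: str, params_id: str) -> str:
    # Custom analysis: compare stored OpenFace data against a stored
    # parameter set, compile the results into a report, and store that too.
    data = store.openface_data[media_id]
    params = store.parameter_sets[params_id]
    findings = {
        au: data["au_intensities"].get(au, 0.0) >= threshold
        for au, threshold in params.items()
    }
    report_id = f"{media_id}:{params_id}"
    store.reports[report_id] = json.dumps(findings)
    return report_id


store = PipelineStore()
store.parameter_sets["smile"] = {"AU12": 1.0}
ingest_media(store, "clip-1", b"...raw video bytes...")
report_id = analyse(store, "clip-1", "smile")
print(store.reports[report_id])  # {"AU12": true}
```

In the practical implementation the `media` dict would be replaced by a reference to an Azure File Share or Blob Storage location, with only that reference kept in the database.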
Logs
One item not yet covered is how logs are created. Each action/service generates log data, which is sent to a custom logger. This logger uses a specific Azure library that forwards the data to Azure Monitor/Application Insights.
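The logging path might look like the sketch below. The factory name `get_pipeline_logger` is hypothetical, and the sketch attaches a plain `StreamHandler` so it runs without Azure credentials; in the real pipeline the Azure library (for example, `configure_azure_monitor` from the azure-monitor-opentelemetry package) would attach a handler that ships records to Azure Monitor/Application Insights.

```python
import logging


def get_pipeline_logger(service_name: str) -> logging.Logger:
    # Hypothetical custom logger factory: each action/service asks for a
    # namespaced logger. Here a StreamHandler stands in for the Azure
    # Monitor / Application Insights exporter that the Azure library
    # would register.
    logger = logging.getLogger(f"pipeline.{service_name}")
    if not logger.handlers:
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger


log = get_pipeline_logger("openface-analysis")
log.info("media item analysed")
```

Because the standard `logging` module is used, swapping the stand-in handler for the real Azure exporter does not change any call sites.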
