Handoff

A handoff document detailing what has been made, what is needed and what could be done with the pipeline in the future.

What I Have Made

At the end of my internship I have delivered a functioning proof-of-concept of the Avatar Validation Pipeline. Users can upload images (creating a job), have them analysed, and have the resulting metrics compared to user-defined parameter sets. This comparison is then detailed in a report.

The status of these jobs can be monitored through the logs published to Azure Monitor. These logs can be queried by the unique ID assigned to each job.
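As an illustration, such a query could be issued with the azure-monitor-query client library. The sketch below is only an assumption about the setup: the workspace ID, the AppTraces table, and a JobId custom dimension are placeholders, not the pipeline's actual logging schema.

    # Sketch: look up all log entries for one job by its unique ID.
    # Table and column names are assumptions; adjust to the real schema.
    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    WORKSPACE_ID = "<log-analytics-workspace-id>"   # placeholder
    JOB_ID = "<job-id>"                             # placeholder

    client = LogsQueryClient(DefaultAzureCredential())
    query = f"""
    AppTraces
    | where tostring(customDimensions.JobId) == "{JOB_ID}"
    | project TimeGenerated, Message, SeverityLevel
    | order by TimeGenerated asc
    """

    response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))
    for table in response.tables:
        for row in table.rows:
            print(row)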

What I Think Is Still Needed

As this is a proof-of-concept, the project still requires some features and polish before it can be used in a professional environment. In this section, I aim to detail some of the most important features that would be needed.

Ability to Analyse Video/Scripts

As of writing, the pipeline only supports images. This could be expanded to include videos relatively easily by adding the corresponding endpoints and blob-storage logic. Eventually, however, the goal will likely be to use scripts instead of media.
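As a rough illustration of the video case, a new upload endpoint could mirror the existing image flow. The sketch below assumes a FastAPI-style service, a connection-string-based Blob Storage client, and a videos container; none of these names are taken from the current codebase.

    # Sketch: video-upload endpoint mirroring the image flow.
    # Framework, container name, and route are assumptions.
    import uuid

    from azure.storage.blob import BlobServiceClient
    from fastapi import FastAPI, UploadFile

    app = FastAPI()
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")

    @app.post("/jobs/video")
    async def create_video_job(file: UploadFile):
        job_id = str(uuid.uuid4())  # same unique-ID idea as image jobs
        blob = blob_service.get_blob_client(container="videos",
                                            blob=f"{job_id}/{file.filename}")
        blob.upload_blob(await file.read())  # store the raw video for analysis
        # ...hand the job off to the analysis module here...
        return {"job_id": job_id}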

The avatar generation pipeline uses a kind of script to store avatar data so it can be replayed later. This is similar to how games store replays: instead of storing footage, they store the inputs and replay them, producing the same result with far less storage. Creating a new module that can convert these scripts into avatar media would be ideal and would help integrate the validation pipeline with the existing generation pipeline.

Proper Analysis Module

The current analysis module is rudimentary and serves purely as a proof-of-concept. For real use, a more advanced module, or a set of modules, would be needed; these can be tailored to the developers' needs.
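To keep such modules interchangeable, it may help to agree on a small shared interface that every analysis module implements and that the comparison against user-defined parameter sets can build on. The sketch below is only an assumption about how that could look, not the current module's API.

    # Sketch: a pluggable analysis-module interface. All names and the
    # (min, max) parameter format are illustrative.
    from typing import Protocol

    class AnalysisModule(Protocol):
        def analyse(self, media_path: str) -> dict[str, float]:
            """Extract named metrics from the uploaded media."""
            ...

    def compare_to_parameters(metrics: dict[str, float],
                              parameters: dict[str, tuple[float, float]]) -> dict[str, bool]:
        """Check each metric against a user-defined (min, max) range; missing metrics fail."""
        return {
            name: name in metrics and low <= metrics[name] <= high
            for name, (low, high) in parameters.items()
        }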

External Application to View Logs

Currently, jobs are tracked using Azure Monitor. The suite it provides is extensive, but using it comes with two downsides: data is displayed in a developer-focused way, and it requires access to the Azure account hosting the services.

Creating an external service with a more developed user interface would solve both of these problems. Azure Monitor exposes a query API, which makes developing a web application to view the logs relatively straightforward.

Security and Other Non-Functionals

The development of the different services was focused entirely on getting something functional up and running, with no attention paid to other desirable qualities. One of these is security: each service can currently be accessed by anyone with the URL. There is no sensitive data yet, but at least a basic token-based authentication scheme and the use of HTTPS should be added.
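Even a single shared API key checked on every request would already raise the bar. The sketch below assumes a FastAPI-based service and a key supplied via an environment variable and an X-API-Key header; none of this exists in the current services.

    # Sketch: require a shared API key on every request.
    # Header name and environment variable are assumptions.
    import hmac
    import os

    from fastapi import Depends, FastAPI, Header, HTTPException

    API_KEY = os.environ["PIPELINE_API_KEY"]

    async def require_api_key(x_api_key: str = Header(default="")):
        # Constant-time comparison to avoid leaking key contents via timing.
        if not hmac.compare_digest(x_api_key, API_KEY):
            raise HTTPException(status_code=401, detail="Invalid or missing API key")

    # Apply the check to every route of the service.
    app = FastAPI(dependencies=[Depends(require_api_key)])

HTTPS itself would typically be handled by the hosting layer (for example, by enforcing HTTPS-only on the Azure services) rather than in application code.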

Future Development

The items I believe should be added first have been covered above. As for the current operation of the pipeline, the Pipeline Component articles have been written to give a detailed overview of how each component works. I believe this is sufficient to understand what each component does and therefore to continue development.

Coming Closer to BUAS' Expectations

There will be a final meeting with BUAS in January to discuss what can be added to better integrate the two pipelines. More on this in the final presentation.

Generalising the Application

Though outside the scope of the project at large, I believe a pipeline like this could also be used in the broader context of testing games or finding irregularities in images and video footage. The general flow of:

Input media => Acquire metrics => Analyse metrics => Store and report

is generic enough to be widely applicable, yet useful enough to add value to many projects. The core difficulty is creating modules that can acquire and then analyse the metrics. Reference material can be beneficial here: by analysing an image of a game and comparing it to a reference image, graphical glitches or bugs could be detected. If the footage involves facial animations, such a pipeline could even use OpenFace, as the current one does.
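In code, that flow reduces to composing four small steps. The sketch below is a generic outline with assumed type names, not the current implementation.

    # Sketch: the generic flow as composable functions. Names are illustrative.
    from typing import Callable

    Metrics = dict[str, float]
    Results = dict[str, bool]

    AcquireMetrics = Callable[[str], Metrics]        # media path -> metrics
    AnalyseMetrics = Callable[[Metrics], Results]    # metrics -> pass/fail per check
    StoreAndReport = Callable[[str, Results], None]  # job ID + results -> stored report

    def run_pipeline(job_id: str, media_path: str,
                     acquire: AcquireMetrics,
                     analyse: AnalyseMetrics,
                     report: StoreAndReport) -> None:
        """Input media => acquire metrics => analyse metrics => store and report."""
        report(job_id, analyse(acquire(media_path)))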
