OpenFace Container

Bundling an API with an existing container to facilitate analysis.

Overview

The OpenFace container exposes a REST service that receives container names and downloads the media inside. This media is then analysed using the OpenFace library, after which the generated data is uploaded to the BLOB storage.

OpenFace as a Base

The base of this container is the OpenFace application: a library which allows the user to analyse images or videos and extract information such as the head position and the intensity values for each facial action unit (AU). More on this here. This is what is used to create the metrics which are then used in a later, separate analysis to determine the realism of a facial expression.

OpenFace can be interfaced with in three main ways: through a bundled user interface, by integrating it into a separate application, or through the command line. The Avatar Validation Pipeline opts for the third option. User interfaces are either impossible or impractical to use in an entirely autonomous code-based pipeline, while tightly integrating OpenFace into a separate application would leave us with a far larger application than necessary. Running OpenFace through the terminal gives easy access to its functionality with very little integration work. More on this here.
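To make the command-line route concrete, here is a small sketch of how such an invocation can be assembled from Python. The binary names (FaceLandmarkImg for single images, FeatureExtraction for videos) and the -f, -out_dir and -wild flags come from the OpenFace command-line documentation; the file paths and output directory are placeholders.

```python
import shlex

def build_openface_command(media, path, in_the_wild=False):
    """Assemble an OpenFace CLI call for a single image or a video."""
    # FaceLandmarkImg processes single images; FeatureExtraction
    # processes video files. Both write their results to -out_dir.
    binary = "FaceLandmarkImg" if media == "image" else "FeatureExtraction"
    args = [binary, "-f", path, "-out_dir", "processed"]
    if in_the_wild and media == "image":
        args.append("-wild")  # loosen landmark assumptions for unposed images
    return shlex.join(args)

print(build_openface_command("image", "face.png", in_the_wild=True))
# FaceLandmarkImg -f face.png -out_dir processed -wild
```

The assembled string can then be handed to the shell, which is exactly what the analysis code later in this page does with the os library.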

Now that this has been established, the next question is how to run these commands remotely in a cloud-based environment. Multiple options were explored, such as triggering a bash script or using the Azure VM service, but the most straightforward option turned out to be a REST API.

Bundling a REST Service

Functionality

The REST service receives information about the media uploaded by the front-end and starts the OpenFace analysis on it. The results thereof are then uploaded, triggering the next module in the pipeline.

Technical Details

The REST service used here was created using Flask, a lightweight Python library. Django was not used here, unlike in the other services, because this service never needs to access the database. That removes one of Django's main selling points, so a lighter option was perfectly viable.

REST services allow modules to be triggered remotely using HTTP calls. In this context, the cloud function makes a call containing the kind of media (for example, an image), the capture type of that media (posed vs. in-the-wild), and the name of the BLOB container holding the media. This container is named with a UUID. See the code block below (some code omitted for brevity).

import logging

from flask import Flask, Response

app = Flask(__name__)
logger = logging.getLogger(__name__)

@app.route('/analyse/<media>/<media_type>/<name>')
def start_analysis(media, media_type, name):
    logger.info("Analysis started...")
    # Guard against empty path segments before starting the analysis.
    if media and media_type and name:
        logger.info("Analysing: (media: %s, type: %s, name: %s)", media, media_type, name)
        analyse = Analysis()
        if media == 'video':
            analyse.analyse_video(name, media_type)
        else:
            analyse.analyse_image(name, media_type)

    return Response(status=200)
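For illustration, the call the cloud function makes simply follows the route pattern above. A minimal sketch, in which the host, port and helper name are assumptions rather than code from the actual pipeline:

```python
import uuid

def analysis_url(base, media, media_type, container_name):
    # Mirrors the /analyse/<media>/<media_type>/<name> route pattern.
    return f"{base}/analyse/{media}/{media_type}/{container_name}"

# Each media container in the BLOB storage is named with a UUID.
container = str(uuid.uuid4())
print(analysis_url("http://localhost:5000", "image", "iw", container))
```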

The analysis itself is relatively simple. Depending on the aforementioned factors, a command is run in the terminal using the python os library (some code omitted for brevity).

def analyse_image(self, container_name, media_type):
    image = self.get_blob(container_name)
    try:
        if media_type.lower() == 'iw':
            # In-the-wild images use OpenFace's -wild flag.
            os.system("../usr/local/bin/FaceLandmarkImg -f " + image + " -wild")
            self.upload_blob(image, container_name)
        elif media_type.lower() == 'ip':
            os.system("../usr/local/bin/FaceLandmarkImg -f " + image)
            self.upload_blob(image, container_name)
    except Exception:
        # Clean up the downloaded file if the analysis fails.
        os.remove(image)

First, the image in the associated container is downloaded. Once this has completed, the command itself can be run. The end result of the analysis is a set of files in /processed, the most important of which is a .csv file.

This file is uploaded to the BLOB storage, triggering the next service in the pipeline.
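To give a sense of what the next service receives, the sketch below pulls the action-unit intensity columns (the AU..._r fields) out of a made-up, heavily trimmed sample row. Real OpenFace output contains many more columns (confidence, head pose, gaze, and so on), and depending on the OpenFace version the header names may carry leading spaces that need stripping.

```python
import csv
import io

# Made-up miniature sample of an OpenFace output .csv (not real data).
sample = io.StringIO(
    "frame,confidence,success,AU01_r,AU12_r\n"
    "1,0.98,1,0.42,2.71\n"
)
reader = csv.DictReader(sample)
row = next(reader)
# Keep only the action-unit intensity columns.
aus = {k: float(v) for k, v in row.items() if k.startswith("AU")}
print(aus)  # {'AU01_r': 0.42, 'AU12_r': 2.71}
```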

Docker implementation

Once the REST service and OpenFace container have been prepared, the next problem to solve is how to integrate these services. Docker works by starting from a base image and adding other variables or services to it in a Dockerfile. With this in mind, there were three options:

  • Use OpenFace as a base image and add the REST service.

  • Use the REST service (python) as a base image and add OpenFace.

  • Use an operating system (such as Ubuntu) as a base image and add both OpenFace and the REST service.

Each of these was explored, and each turned out to have more difficulties than first expected.

1. Use OpenFace as a base image and add the REST service

The issue here is that this not only forces the addition of the REST service, but also an installation of Python to make it work. When trying this, there were continuous issues installing Python (both when downloading it using apt-get and when compiling it manually). It also required SSL support, which proved difficult to get working.

2. Use the REST service (python) as a base image and add OpenFace

It was expected from the outset that this would be the most difficult solution. Just by taking a look at the included Dockerfile for OpenFace, one can see there is a large number of required dependencies, such as OpenCV and dlib. This leaves a lot of room for error when trying to recreate the image. Even if everything works, this creates an image far larger than one would normally expect. Aside from that, the version of OpenFace used in the publicly available image is based on a far older fork.

3. Use Ubuntu as a base, then add both OpenFace and the REST service

This could be done in two ways: one huge Dockerfile using Ubuntu as the base image, with the REST service and the OpenFace library together in one repository, or two separate Dockerfiles, using one of them as the base image. I opted for the latter, as it involves far less clutter and results in a very simple Dockerfile.

Step one was to create a base image. Starting from Ubuntu, the original OpenFace Dockerfile was essentially copied with one crucial addition: a Python installation. The end result is an image of an up-to-date OpenFace build that includes Python. There is no need to host this image, as it can be accessed locally.
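A heavily abridged sketch of what that base Dockerfile might look like. The package list is illustrative only, and the actual OpenFace build steps (OpenCV, dlib, downloading the trained models) are omitted:

```docker
FROM ubuntu:20.04

# Build tools plus the crucial addition: Python.
RUN apt-get update && apt-get install -y \
    build-essential cmake git wget \
    python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*

# ... clone and build OpenFace here, as in the original Dockerfile ...
```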

The next step was to create the Dockerfile for the REST service. It simply uses the aforementioned image as a base, then installs the requirements and starts the web server.

FROM openface/openface:latest
WORKDIR /code
COPY . .
RUN pip install -r requirements.txt
EXPOSE 5000
CMD gunicorn -w 4 --bind 0.0.0.0:5000 app:app
