Case Study

How Joot Accelerated Development and Reduced Overhead Costs With Serverless

Joot helps content creators, advertisers, and visual curators improve and predict their social media and advertising image engagement through machine learning and AI.

By using the Serverless Framework, Joot auto-scaled their infrastructure to handle web API, machine learning, and image processing workloads with their scrappy startup team.

Key Benefits
Serverless provided the platform to run everything from web APIs to AI and machine learning tasks.

A low-cost environment that saves over 70% in server costs by automatically scaling based on demand and compute needs.

Radically increased team efficiency by reducing development and DevOps tasks.
How Joot Works
Joot uses millions of data points to rank and score each uploaded image against a database of social posts. The result is an intelligent prediction tool that helps you better understand what your audience wants to see, resulting in higher return on ad spend, increased conversions, and more social engagement.
The Problem
Joot’s challenge
Joot needed to accelerate the development of their social media image optimization tool.

Their goal was to give clients the knowledge of which images to use for ads, landing pages, and social posts. Joot set out to make it possible to know which images would perform best across their customers' ecommerce and social platforms.

As a brand new company, they faced the challenge of quickly deploying a fully functional tool that could easily scale with client acquisition. They needed a powerful framework solution that could serve multiple clients while being highly reliable and scalable. Some of the issues they faced when embarking on this project were:
Orchestrating 12 different steps in the processing pipeline
Joot’s image processing architecture can involve up to 12 different steps in its pipeline, and some functions can run for several minutes due to machine learning processing and model creation. The Serverless Framework made it easy to orchestrate those steps.
Resource management of containers and virtual machines
Unlike containers or virtual machines, going Serverless let Joot pay for only the resources they actually consumed and let them stage and tear down entire environments within minutes. The Serverless Framework streamlined everything - from development to deployment and operations, enabling them to create functioning web services within minutes and define their architecture in code. It also radically simplified duplication of their local dev environments to match what was in production.
“We didn’t start off Joot by trying something else and then deciding to go with Serverless. This was a conscious decision from the very beginning. We needed to develop fast, get something up and running and to be able to design our architecture with existing services that would just work.”
Chris Crabtree
Co-Founder, Joot
The Solution
Why Joot chose Serverless
Serverless was the all-in-one solution for Joot. Joot’s team could install it, set up a project, and have something up and running from day one. Serverless development provides an unprecedented amount of choice and flexibility to meet a vast array of scenarios - from web APIs to machine learning - and it allows you to do it remarkably fast.
A system that could scale up and down based on cluster demand and machine-learning compute needs
Unlike a cluster of machines, where there is no clear indication of how many requests or how much traffic the cluster can actually handle, serverless services have specific, documented limits. This is useful because your application won’t collapse under the weight of heavy traffic when you hit those limits; instead, the services return very clear responses via the SDK you use to interact with them. That means you can handle service limits much more cleanly for your end users, without falling back to the browser’s default 500 error page.

It also means that when you have periods of no load or traffic, you aren’t still paying for a machine sitting around doing nothing. Serverless services bill only for actual use; otherwise they scale down to zero.

Furthermore, if you find you are going to be using more of that service than the limits allow, in the vast majority of cases, you can just send AWS support a message to request an increase.
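Hitting a limit typically surfaces in the SDK as a throttling error. A minimal sketch of turning those clear responses into a retry policy - the error codes below are real AWS throttling codes, but the backoff policy itself is illustrative, not Joot's implementation:

```javascript
// Error codes AWS SDKs return when a service limit is hit.
const RETRYABLE_CODES = new Set([
  "ThrottlingException",
  "TooManyRequestsException",
  "ProvisionedThroughputExceededException",
]);

// Classify an SDK error: is this a service limit we can retry past?
function isThrottled(err) {
  return RETRYABLE_CODES.has(err.code);
}

// Exponential backoff with a cap: 100ms, 200ms, 400ms, ... up to 5s.
function backoffMs(attempt, baseMs = 100, capMs = 5000) {
  return Math.min(capMs, baseMs * 2 ** attempt);
}
```

Because the error response is explicit rather than an opaque 500, the caller can retry with backoff or show the user a meaningful message instead of a generic failure.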
Complete (and ever-growing) suite of web-services to fit nearly any requirement
If you take a look at the list of services AWS provides, there is an enormous variety to choose from. Even limiting the selection to serverless services allows for nearly unlimited customization and creativity.

The image below shows the architecture Joot uses for image processing. All of Joot’s users are managed using Cognito, Amplify was used as the library for the web app, and Lambda was used to coordinate user authentication. When a user comes into the system, they upload an image, which is saved directly into S3. Triggers on the image-upload bucket then pull that image out, resize it, run processing to extract metadata, and store the metadata in DynamoDB.
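The upload step described above can be sketched as an S3-triggered Lambda. The metadata fields and table name here are assumptions for illustration, not Joot's actual schema:

```javascript
// Hypothetical sketch: extract basic metadata from one record of an S3
// "ObjectCreated" event, ready to be written to DynamoDB.
function extractImageMeta(s3Record) {
  const { bucket, object } = s3Record.s3;
  return {
    // S3 keys arrive URL-encoded with "+" for spaces; decode for storage.
    imageId: decodeURIComponent(object.key.replace(/\+/g, " ")),
    bucket: bucket.name,
    sizeBytes: object.size,
    uploadedAt: s3Record.eventTime,
  };
}

// In the real handler each record's metadata would then be persisted, e.g.:
// await dynamoDb.put({ TableName: "ImageMeta", Item: meta }).promise();
```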

Early on, Joot wanted to rely heavily on an event-driven architecture using EventBridge. They also knew there were many properties they wanted to extract from an image - quality, color, classification, and so on. An event would place the image information into several SQS queues, each attached to Lambdas that would pull the image out and perform the required type of processing. Once processing completed, a new event would be triggered and picked up by another service, which checked whether a model had been built for that market or that particular customer and then sent the image to a scoring queue. From there, the image would be turned into a training record and passed on to a model endpoint built for that customer.
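The fan-out above starts with an EventBridge event. A minimal sketch of emitting the "Process Image Request" event - the detail-type matches the handler configuration shown later in this case study, but the `Source` name and detail fields beyond `stage` and `use_queue` are assumptions:

```javascript
// Build the putEvents parameters for a "Process Image Request" event.
function buildProcessImageEvent(stage, imageId, useQueue) {
  return {
    Entries: [
      {
        Source: "joot.images", // assumed source name
        DetailType: "Process Image Request",
        Detail: JSON.stringify({
          stage,              // lets handlers filter by deployment stage
          image_id: imageId,  // assumed field name
          use_queue: useQueue,
        }),
      },
    ],
  };
}

// const eventBridge = new AWS.EventBridge({ region: "us-west-2" });
// await eventBridge.putEvents(buildProcessImageEvent("dev", "img-123", true)).promise();
```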

In SageMaker, one of the options is to host either a multi-model or a single-model endpoint. Joot uses single-model endpoints for each model built for a customer, so when a customer builds a model through Joot’s system using SageMaker, they get their own dedicated endpoint to use for scoring in that market.
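Scoring against a per-customer endpoint might look like the sketch below. What the case study confirms is one single-model endpoint per customer model; the naming convention and payload shape are assumptions:

```javascript
// Hypothetical per-customer endpoint naming convention.
function endpointNameFor(customerId, stage) {
  return `joot-${stage}-score-${customerId}`;
}

// Invoking the endpoint would then use the SageMaker runtime client, e.g.:
// const runtime = new AWS.SageMakerRuntime({ region: "us-west-2" });
// const res = await runtime
//   .invokeEndpoint({
//     EndpointName: endpointNameFor("acme", "prod"),
//     ContentType: "application/json",
//     Body: JSON.stringify(trainingRecord),
//   })
//   .promise();
```

A single-model endpoint per customer keeps each customer's model isolated and independently deployable, at the cost of one always-on endpoint per model.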
Event-driven Architecture
One thing Joot found helpful was that, for each event in EventBridge, the easiest way to keep things simple was to create handlers. Those handlers (shown below) are very simple Lambdas. Each is responsible for listening for an event, handling the event itself, and then making whatever calls to other Lambda functions its microservice requires.
functions:
    handleProcessImage:
        handler: handleProcessImage.main
        environment:
            queueUrl:
                Fn::ImportValue: ${self:custom.resourcesStage}-ProcessImageQueueUrl
        events:
            - eventBridge:
                pattern:
                    detail-type:
                        - 'Process Image Request'
                    detail:
                        stage:
                            - ${self:custom.stage}
  
import AWS from "../../libs/aws-sdk";
import * as sqs from "../../libs/sqs-lib";

const lambda = new AWS.Lambda({ region: "us-west-2" });

export async function main(event, context) {
    if (event.detail.use_queue) {
        // Buffer the work through SQS for queued processing.
        await sqs.sendMessage(process.env.queueUrl, event.detail);
    } else {
        // Otherwise invoke the processing Lambda directly.
        await triggerImageProcess(event.detail);
    }
}

async function triggerImageProcess(data) {
    const lambdaParams = {
        FunctionName: process.env.processImageLambdaName,
        InvocationType: "Event", // asynchronous invocation
        Payload: JSON.stringify(data),
        LogType: "None",
    };
    await lambda.invoke(lambdaParams).promise();
}

Flexible development environment supporting multiple languages
Low cost environment to develop, test, and experiment
Serverless at its core makes use of commoditized cloud services, so you pay only for what you use. As opposed to renting the server capacity you need and ensuring it can handle your maximum load, serverless services can reduce capacity to zero when needed.
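A back-of-the-envelope sketch of that pay-per-use point: estimated monthly Lambda compute cost at a given load. The prices below are assumptions for illustration (roughly AWS's published rates at the time of writing); check current pricing before relying on them:

```javascript
// Estimate monthly Lambda cost for a given invocation count, average
// duration, and memory size. Rates are ASSUMED for illustration only.
function lambdaMonthlyCostUSD(invocations, avgMs, memoryMB) {
  const PER_MILLION_REQUESTS = 0.2;   // assumed $/1M requests
  const PER_GB_SECOND = 0.0000166667; // assumed $/GB-second
  const gbSeconds = invocations * (avgMs / 1000) * (memoryMB / 1024);
  return (invocations / 1e6) * PER_MILLION_REQUESTS + gbSeconds * PER_GB_SECOND;
}

// Zero traffic really does mean zero compute cost - there is no idle
// server to pay for: lambdaMonthlyCostUSD(0, 200, 512) is 0.
```

Compare that with a fixed-capacity server billed around the clock whether or not requests arrive, and the "scale to zero" savings are easy to see.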

In addition, because these fully managed services reduce the load on your DevOps team, you may not need to staff as heavily for that rare and expensive skill set - saving the cost of a position as expensive as a DevOps engineer.

When you factor in not just the AWS bill generated by building serverless applications, but also the expense saved on additional staff to manage infrastructure and the opportunity cost of lost productivity, the total savings become substantial.
“One of the number one factors we chose Serverless was because we didn’t want to depend on dev-ops and worry about whether we had enough resources available.”
Chris Crabtree
Co-Founder, Joot
Fast and agile to develop on Serverless
Not only do serverless services handle scaling for you automatically, but the way services such as AWS Lambda are architected makes them less likely to suffer from the noisy-neighbor issues that are prevalent in traditional server-based applications.

When you build a traditional server-based application hosted on virtual machines (or container clusters that are hosted on virtual machines), each server processes multiple threads simultaneously on the same CPU and memory. In an API backend, for example, multiple client requests initiate invocations of application code in separate threads within the OS. If one of those threads happens to be inefficient, it consumes more CPU or memory, leaving fewer resources for the other requests. That means fewer simultaneous threads can be processed on the same infrastructure, and the larger the cluster needs to be to handle the same load.

If the code is inefficient enough, the problem escalates: the existing clusters of virtual machines become overwhelmed with traffic, and the application as a whole cannot spin up enough additional capacity in time to prevent a cascading failure where everything falls over.

This means that, right up front, teams need to invest at least some effort in proving the application's performance - which is difficult when your primary focus is building features, and which requires time and resources to load-test your app.

Now move to a serverless world, where that same slightly inefficient code executes in AWS Lambda instead. Tests have shown that the “noisy neighbor” effect between multiple Lambda functions is small to insignificant. Lambda functions are so well isolated that an inefficient piece of code doesn’t create a cascading effect that brings down an entire cluster. At worst, latencies for requests go up - and the moment you realize this is happening, you can refactor your service.

With Serverless, you can get the feature out first, then optimize later with the confidence that it will scale.
The Results
Joot accelerated development and reduced costs with Serverless
With the help of Serverless, Joot was able, in a six-month period, to achieve:
Inherently scalable by design with drastically reduced resource-management requirements
Joot’s “no-ops” approach to growth cut down on routine maintenance work. Without the bustle of infrastructure and server upkeep, Joot was able to deploy rapidly using a rich bank of building blocks and tools. The Serverless Framework energized rapid software development for Joot, with a potential 70-90% reduction in server costs thanks to its “pay per use” model.
A fully functional application delivering machine learning predictive AI
By using serverless development, Joot leveraged the unique tools available to overcome time constraints and development hurdles.

“We were able to construct everything we needed within a 6 month period of time and it was wonderful. Kudos to the Serverless team, our experience at Joot was just phenomenal being able to use Serverless to build this whole application from scratch.”
Chris Crabtree
Co-Founder, Joot
Closing note
What’s next for Joot
Joot continues to improve and expand its image-insight and engagement-prediction offerings, recently introducing a number of free, finely tuned AI models for markets across industries ranging from children’s clothing to rap music, fine wine, luxury hotels, and over 100 more.
For more information about Joot visit:
“We wanted to provide a way for users to come in and immediately try out the image ranking in their target industry without having to create and wait for the production of a custom model.

We continue to look for opportunities to integrate Joot as a must-have technology for image engagement prediction in social-media and advertising management platforms. We believe Serverless will allow for these future integrations to be a much smoother and seamless process.”
Chris Crabtree
Co-Founder, Joot