Zelle Payment

API Integration Guide for Facial Recognition

You can integrate facial recognition into your system in two ways: a no-code solution built on our Request API, which is the fastest way to add facial verification to your system, or the advanced custom APIs described later in this guide.

1. Enroll API

The Enroll API enrolls the faces required for biometric verification. It consumes user details such as name, phone, and email to register the user's facial data in our system. The response returns an enroll URL, which can be embedded as an iframe in any system; a sample call is sketched after the response below.

Url :- https://api.satschel.com/v2/biometrics/enroll

Type:- POST,

Payload:-

{
  "name": "John Doe",
  "phone": "415-832-1231",
  "email": "[email protected]"
}

Response :

{
  "_id": "64367bb4e1ffb3db58997735",
  "url": "https: //api.satschel.com/v2/biometrics/face-enroll",
  "createdAt": "2023-06-08T08:13:51.165Z",
  "updatedAt": "2023-09-04T06:50:23.958Z"
}
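
For reference, here is a minimal sketch of calling the Enroll API in TypeScript with fetch. The authorization header and token are assumptions; substitute whatever credentials your account uses.

// Minimal sketch of an Enroll API call (fetch is available in Node 18+ and browsers).
interface EnrollResponse {
  _id: string;
  url: string;       // enroll URL to embed as an iframe
  createdAt: string;
  updatedAt: string;
}

async function enrollFace(name: string, phone: string, email: string): Promise<EnrollResponse> {
  const res = await fetch("https://api.satschel.com/v2/biometrics/enroll", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Assumption: replace with the auth scheme issued for your account.
      Authorization: "Bearer <YOUR_API_TOKEN>",
    },
    body: JSON.stringify({ name, phone, email }),
  });
  if (!res.ok) throw new Error(`Enroll failed with status ${res.status}`);
  return res.json() as Promise<EnrollResponse>;
}

The url field from the response can then be embedded as an iframe to collect the user's face scan.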

2. Generate API

The Generate API generates an iframe URL to initiate the end-to-end facial biometric process. This is the easiest way to integrate Zelle payment fraud prevention into your system. The request consumes the amount plus the sender and recipient IDs (obtained from the Enroll API) along with their phone numbers; either the sender/recipient IDs or their personal information is required in the payload. A callbackUrl is also accepted so that you are notified as each step of the flow completes.

Url :- https://api.satschel.com/v2/biometrics/generates

Type:- POST,

Payload:-

{
  "sender": {
    "id": "64e3824bd0b452ea835a9f7a",
    "phone": "4157671241",
    "name": "John Doe",
    "email": "[email protected]"
  },
  "recipient": {
    "id": "64e3824bd0b452ea835a9f7a",
    "phone": "413552423",
    "name": "Marry Doe",
    "email": "[email protected]"
  },
  "amount": "10.45",
  "callbackUrl": "https: //www.webhook.site/my-hook/1937193"
}

Response:-

{
  "_id": "64367bb4e1ffb3db58997735",
  "url": "https: //biometrics.simplici.io/64e3824bd0b452ea835a9f7a"
}
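
The same call in TypeScript, as a hedged sketch; the payload shape mirrors the example above and the authorization header is an assumption.

// Sketch: create a biometric session and obtain the iframe URL for the payment flow.
interface Party {
  id?: string;      // from the Enroll API
  phone: string;
  name?: string;
  email?: string;
}

async function generateSession(
  sender: Party,
  recipient: Party,
  amount: string,
  callbackUrl: string
): Promise<string> {
  const res = await fetch("https://api.satschel.com/v2/biometrics/generates", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer <YOUR_API_TOKEN>", // assumption
    },
    body: JSON.stringify({ sender, recipient, amount, callbackUrl }),
  });
  if (!res.ok) throw new Error(`Generate failed with status ${res.status}`);
  const { url } = await res.json();
  return url; // place this URL in an iframe (allow="camera") to start the flow
}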

3. Webhooks

A webhook URL is a mechanism that allows one system or application to notify another about a specific event or action that has occurred. Here, the webhook URL (the callbackUrl supplied to the Generate API) is used to inform you that a request submitted into the system has been completed. The webhook can send responses based on different event types, which correspond to the various stages or outcomes of request submission and processing, such as completed, sent, pending, processing, approved, or rejected. Below are the event types for the complete biometric flow; a sketch of a receiver follows the examples.

  1. Enroll face -
{
  "event": "enroll-face",
  "apiVersion": "v2",
  "status": "completed",
  "url": "biometrics/enroll",
  "retryCount": 0,
  "refId": "64367bb4e1ffb3db58997735",
  "initiatedAt": "2023-10-20T09:08:43.1131080Z",
  "inputData": {
    "name": "John Doe",
    "phone": "415-832-1231",
    "email": "[email protected]"
  },
  "outputData": {
    "_id": "64367bb4e1ffb3db58997735",
    "url": "https: //api.satschel.com/v2/biometrics/face-enroll"
  },
  "completedAt": "2023-10-20T09:10:17.297Z"
}
  2. Recipient Notification -
{
  "event": "recipient-notify",
  "apiVersion": "v2",
  "status": "sent",
  "url": "biometrics/recipient/64367bb4e1ffb3db58997735",
  "retryCount": 0,
  "refId": "64367bb4e1ffb3db58997735",
  "initiatedAt": "2023-10-20T09:08:43.1131080Z",
  "inputData": {
    "name": "John Doe",
    "email": "[email protected]",
    "phone": "413552423",
    "approvalStatus": "pending"
  },
  "outputData": {
    
  },
  "completedAt": "2023-10-20T09:10:17.297Z"
}
  3. Approval/Denial Notification -
{
  "event": "sender-notify",
  "apiVersion": "v2",
  "status": "completed",
  "url": "biometrics/64367bb4e1ffb3db58997735/status",
  "retryCount": 0,
  "refId": "64367bb4e1ffb3db58997735",
  "initiatedAt": "2023-10-20T09:08:43.1131080Z",
  "inputData": {
    
  },
  "oututData": {
    "name": "John Doe",
    "email": "[email protected]",
    "phone": "413552423",
    "imageUrl": "https://storage.googleapis.com/xyz/images/face/userId.png",
    "approvalStatus": "rejected"
  },
  "completedAt": "2023-10-20T09:10:17.297Z"
}
  4. Facial Verification -
{
  "event": "facial-verify",
  "apiVersion": "v2",
  "status": "completed",
  "url": "biometrics/64367bb4e1ffb3db58997735/verify",
  "retryCount": 0,
  "refId": "64367bb4e1ffb3db58997735",
  "initiatedAt": "2023-10-20T09:08:43.1131080Z",
  "inputData": {
    
  },
  "oututData": {
    "name": "John Doe",
    "email": "[email protected]",
    "phone": "413552423",
    "liveness": true,
    "imageUrl": "https://storage.googleapis.com/xyz/images/face/userId.png",
    "facialVerified": true
  },
  "completedAt": "2023-10-20T09:10:17.297Z"
}
  5. Liveness Verification -
{
  "event": "liveness-verify",
  "apiVersion": "v2",
  "status": "completed",
  "url": "biometrics/64367bb4e1ffb3db58997735/live-verify",
  "retryCount": 0,
  "refId": "64367bb4e1ffb3db58997735",
  "initiatedAt": "2023-10-20T09:08:43.1131080Z",
  "inputData": {
    
  },
  "oututData": {
    "phone": "413552423",
    "liveness": true,
    "imageUrl": "https://storage.googleapis.com/xyz/images/face/userId.png",
    "facialVerified": false
  },
  "completedAt": "2023-10-20T09:10:17.297Z"
}
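
The callbackUrl passed to the Generate API receives one POST per event above. Below is a minimal sketch of a receiver; Node with Express is an assumption used purely for illustration, and the route path simply mirrors the example callbackUrl.

// Illustrative webhook receiver; any HTTP server that accepts JSON POSTs will do.
import express from "express";

const app = express();
app.use(express.json());

app.post("/my-hook/1937193", (req, res) => {
  const { event, status, refId, outputData } = req.body;
  switch (event) {
    case "enroll-face":
      console.log(`Enrollment ${status} for ${refId}`);
      break;
    case "recipient-notify":
      console.log(`Recipient notified (${status}) for ${refId}`);
      break;
    case "sender-notify":
      console.log(`Request ${outputData?.approvalStatus} for ${refId}`);
      break;
    case "facial-verify":
    case "liveness-verify":
      console.log(`${event} ${status}`, outputData);
      break;
    default:
      console.log(`Unhandled event: ${event}`);
  }
  res.sendStatus(200); // acknowledge quickly so the event is not retried
});

app.listen(3000);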

4. Get Transactions Details API

The Transactions API returns the details of the sender and recipient, the amount transferred, their geolocation data, and their compliance data (only if the user has gone through our onboarding flow). The API also provides user-agent details such as device and browser information.

Url :- https://api.stage.satschel.com/v2/pipelines/transactions/{id}

Type:- GET,

Response:-

{
  "message": "ok",
  "data": {
    "paymentInfo": {
      "ip": "44.143.12.12",
      "location": {
        "sender": {
          "accuracy": 13.547,
          "altitude": null,
          "altitudeAccuracy": null,
          "latitude": 12.9470625,
          "longitude": 77.6750879
        },
        "recipient": {
          "accuracy": 12.334,
          "altitude": null,
          "latitude": 12.9471004,
          "longitude": 77.6750603
        }
      },
      "userAgent": {
        "ua": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36",
        "browser": {
          "name": "Chrome",
          "version": "115.0.0.0",
          "major": "115"
        },
        "engine": {
          "name": "Blink",
          "version": "115.0.0.0"
        },
        "os": {
          "name": "Mac OS",
          "version": "10.15.7"
        },
        "device": {
          "vendor": "Apple",
          "model": "Macintosh"
        },
        "cpu": {
          
        }
      },
      "senderImage": "",
      "recipientImage": "",
      "recipient_id": "64f02c953a7c7e69cce30924",
      "amount": 1
    },
    "sender": {
      
    },
    "recipient": {
      
    },
    "senderSessionId": "64e3824bd0b452ea835a9f7a",
    "recipientSessionId": "63b510c99f81f2b4e59e5d9c",
    "date": 1693461747446,
    "senderName": "Austin Trombley",
    "recipientName": "Alina Trombley"
  }
}
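
A minimal TypeScript sketch of fetching these details is shown below; the path parameter is the transaction id and the authorization header is an assumption.

// Sketch: fetch transaction details by id.
async function getTransactionDetails(id: string) {
  const res = await fetch(`https://api.stage.satschel.com/v2/pipelines/transactions/${id}`, {
    headers: { Authorization: "Bearer <YOUR_API_TOKEN>" }, // assumption
  });
  if (!res.ok) throw new Error(`Transaction lookup failed: ${res.status}`);
  const { data } = await res.json();
  // Inspect the parties, the amount, and the sender's geolocation.
  console.log(data.senderName, data.recipientName, data.paymentInfo.amount);
  console.log(data.paymentInfo.location.sender);
  return data;
}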

Liveness detection in face scans is vital for enhancing security, preventing fraud, protecting privacy, and ensuring compliance with regulations. It plays a crucial role in the trustworthiness and reliability of biometric authentication systems.

We built an algorithm to identify a blink, and we require two blinks during the facial scan to ensure that the user is live and not a spoof, still image, or mask. Ensuring liveness in conjunction with biometric similarity is critical to confirming the user is who they say they are.

To calculate the blink count, we use facial landmarks: we mathematically measure the distances between the eye landmarks and compute the Eye Aspect Ratio (EAR) to detect a blink.
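
The sketch below illustrates the Eye Aspect Ratio idea using the common six-point eye landmark layout; the exact landmarks, threshold, and smoothing used in production may differ.

// Illustrative EAR-based blink counting. Eye landmarks p1..p6 follow the standard layout:
// p1/p4 are the eye corners, p2/p3 the upper lid, p5/p6 the lower lid.
type Point = { x: number; y: number };

const dist = (a: Point, b: Point) => Math.hypot(a.x - b.x, a.y - b.y);

// EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|); it drops sharply when the eye closes.
function eyeAspectRatio([p1, p2, p3, p4, p5, p6]: Point[]): number {
  return (dist(p2, p6) + dist(p3, p5)) / (2 * dist(p1, p4));
}

// Count a blink each time the EAR dips below a threshold and then recovers.
// The 0.2 threshold is an assumption for illustration only.
function countBlinks(earPerFrame: number[], threshold = 0.2): number {
  let blinks = 0;
  let closed = false;
  for (const ear of earPerFrame) {
    if (ear < threshold) {
      closed = true;
    } else if (closed) {
      closed = false;
      blinks += 1;
    }
  }
  return blinks;
}

The scan satisfies the liveness requirement once countBlinks returns at least two, matching the two-blink rule described above.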

Advanced API Integration

Get Facial Data:

After the user's face is enrolled in our system, the Get Facial Data API returns the face enrollment data for that user, using the user ID obtained from the Enroll API.

Url :- https://api.stage.satschel.com/v2/biometrics/get/64367bb4e1ffb3db58997721

Type:- GET,

Response:-

{
  "_id": "64818dbff593e9e360e692fa",
  "name": "Alina Trombley",
  "phone": "4156783342d4",
  "countryCode": "+1",
  "email": "[email protected]",
  "userId": "64367bb4e1ffb3db58997721",
  "createdAt": "2023-06-08T08:13:51.165Z",
  "updatedAt": "2023-09-04T06:50:23.958Z",
  "imagePath": "arc-face/images/64367bb4e1ffb3db58997735/4410f3bb-8216-4bbc-a184-6e8cfa11f3ae.png",{
  "image": "imageurl",
  "width": 360"height": 360,
  
}
  "eyeRatio": [
    
  ],
  "image": "https://storage.googleapis.com/satschel-assets-public/arc-face/images/64367bb4e1ffb3db58997735/4410f3bb-8216-4bbc-a184-6e8cfa11f3ae.png?GoogleAccessId=nich-service-account%40glossy-fastness-305315.iam.gserviceaccount.com&Expires=1742149800&Signature=VBOx%2F6dhPJuRFjWit3k7ajpeZZlIwMdBusBaEvkrOQp89Og%2FCMQrM6G42i%2Bx3tz%2BLsUq%2BCu%2FDWZE0uO1K%2BsEbAMFALxygfcbMwX2GOqiJAIcnZx7OlHG7c11uKQnDr0DdSs3vRWthC197OhynNJv9JyNJH2OtJFm5wLhM4MMbsGo81H567IP1YAHGDLFmKmENcSk8rJtTHWA0V57yuz%2FTDOidjfhKg0P0L%2BygNA6XkQNLKr9ryQQ62iglUU8MHXa8VM9Pkcybght0BbrOrVxrHbOeq3bvLQJ39onQbJJyNvqpmQWw7AHTYAf2H68DpxeK6gBYEXV89hGorP1GwL6JQ%3D%3D",
  "descriptors": [
    "embeddingretrievefrombase64imageurl"
  ]
}
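
A minimal sketch of the call, assuming the same bearer-token authorization as the other endpoints:

// Sketch: fetch the enrollment record for a user id returned by the Enroll API.
async function getFacialData(userId: string) {
  const res = await fetch(`https://api.stage.satschel.com/v2/biometrics/get/${userId}`, {
    headers: { Authorization: "Bearer <YOUR_API_TOKEN>" }, // assumption
  });
  if (!res.ok) throw new Error(`Facial data lookup failed: ${res.status}`);
  // The signed image URL in the response expires, so use it promptly;
  // descriptors contain the stored face embedding.
  return res.json();
}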

Face match

Now it's time for a facial match and recognition. Facial recognition can be done in two ways:
a) Server Side
b) Client Side

Server Side Approach :
In this approach we use a backend API to match the user's image against the existing embeddings. The request passes the image as a base64 URL along with the image dimensions; the response provides the best-matched image and the user's details.

Url :- https://api.stage.satschel.com/v2/biometrics/match?type=arc-facial

Type:- POST,

Payload:-

{
  "image": "imageurl",
  "width": 360,
  "height": 360
}

Response:-

{
  "match": true"matchedImageURL": """detection": "",
  "userId": "64367bb4e1ffb3db58997721",
  "name": "Alina Trombley",
  "phone": "4156783342d4",
  "countryCode": "+1",
  "email": "[email protected]",
  "userId": "64367bb4e1ffb3db58997721"
}
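
Below is a sketch of the server-side match call. Capturing a frame and converting it to a base64 data URL is left to your client; only the endpoint and payload shape come from the example above.

// Sketch: send a captured frame to the match endpoint and read the best match.
async function matchFace(imageBase64: string, width: number, height: number) {
  const res = await fetch(
    "https://api.stage.satschel.com/v2/biometrics/match?type=arc-facial",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer <YOUR_API_TOKEN>", // assumption
      },
      body: JSON.stringify({ image: imageBase64, width, height }),
    }
  );
  if (!res.ok) throw new Error(`Match request failed: ${res.status}`);
  const result = await res.json();
  // result.match is true when the face is recognized; userId and name identify the enrolled user.
  return result.match ? result : null;
}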

Client Side Approach :

We provide an iframe or web component that can be embedded directly in your system. When the component loads, the facial recognition modal opens and initiates face detection. Below is an example of the web component.

<face-match-detection
  type="arc-facial"
  bgcolor="#f7f7f7"
  theadcolor="#d9e1f2"
  userId="def060617667464f9afc55fa00fd95df"
  authToken="d28f0fb3be31a7a4df65e3a2c20843d7b3fba6c3d809bf6877a6af944e677552=="
  btnprimary="#3c65d6"
  btnsecondary="#ebeef5"
  sidebar="true"
></face-match-detection>

<script src="https://app.simplici.io/web-component.js"></script>