{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"vscode": {
"languageId": "plaintext"
}
},
"source": [
"## Example Usage of the UniFace Library for Face Detection and Alignment\n",
"This guide demonstrates how to use the **UniFace** library for face detection and face alignment. Follow the steps below to set up and run the example.\n",
"\n",
"## 1. Install UniFace\n",
"Install the **UniFace** library using `pip`. The `-q` flag suppresses installation logs for cleaner output."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install -q uniface"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Import Required Libraries\n",
"Import the necessary libraries for image processing, visualization, and face detection and alignment:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import cv2\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"from uniface import RetinaFace, face_alignment, draw_detections\n",
"from uniface.constants import RetinaFaceWeights"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- `cv2`: Used for reading and processing images.\n",
"- `numpy`: Used for working with the model outputs, which are returned as NumPy arrays.\n",
"- `matplotlib`: Used to display the inference results.\n",
"- `RetinaFace`: The face detection model class from the **UniFace** library.\n",
"- `face_alignment`: A utility function that aligns a detected face using its landmarks.\n",
"- `draw_detections`: A utility function to draw bounding boxes and landmarks on the image.\n",
"- `RetinaFaceWeights`: Identifiers for the available pre-trained RetinaFace weights (here, `MNET_V2`).\n",
"\n",
"## 3. Initialize the RetinaFace Model\n",
"Initialize the RetinaFace model with a lightweight pre-trained backbone (`MNET_V2`) and its detection parameters:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2025-03-26 11:44:08,755 - INFO - Initializing RetinaFace with model=RetinaFaceWeights.MNET_V2, conf_thresh=0.5, nms_thresh=0.4, pre_nms_topk=5000, post_nms_topk=750, dynamic_size=False, input_size=(640, 640)\n",
"2025-03-26 11:44:08,755 - INFO - Model 'RetinaFaceWeights.MNET_V2' already exists at C:\\Users\\yakhyo\\.uniface\\models\\RetinaFaceWeights.MNET_V2.onnx\n",
"2025-03-26 11:44:08,767 - INFO - Verified model weights located at: C:\\Users\\yakhyo/.uniface/models\\RetinaFaceWeights.MNET_V2.onnx\n",
"2025-03-26 11:44:08,825 - INFO - Successfully initialized the model from C:\\Users\\yakhyo/.uniface/models\\RetinaFaceWeights.MNET_V2.onnx\n"
]
}
],
"source": [
"# Initialize the RetinaFace model\n",
"uniface_inference = RetinaFace(\n",
"    model_name=RetinaFaceWeights.MNET_V2,  # Model name\n",
"    conf_thresh=0.5,  # Confidence threshold\n",
"    pre_nms_topk=5000,  # Pre-NMS Top-K detections\n",
"    nms_thresh=0.4,  # NMS IoU threshold\n",
"    post_nms_topk=750  # Post-NMS Top-K detections\n",
")"
]
},
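{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before running the full batch, it can help to check what `detect()` returns for a single image. The cell below is an optional, minimal sketch: it only assumes what the batch loop in the next step already relies on, namely that `detect()` takes a BGR image loaded with `cv2.imread` and returns a pair of per-face arrays (`boxes` and `landmarks`). The sample path is just an example."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: run detection on one image and inspect the outputs.\n",
"# The path below is an example; replace it with any locally available image.\n",
"sample_path = \"../assets/test_images/image0.jpg\"\n",
"sample_image = cv2.imread(sample_path)\n",
"\n",
"if sample_image is None:\n",
"    print(f\"Could not read {sample_path}\")\n",
"else:\n",
"    boxes, landmarks = uniface_inference.detect(sample_image)\n",
"    print(\"Number of detected faces:\", len(boxes))\n",
"    print(\"Bounding boxes shape:\", np.asarray(boxes).shape)\n",
"    print(\"Landmarks shape:\", np.asarray(landmarks).shape)"
]
},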
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. Load Images and Run Inference\n",
"Load a set of input images, run face detection and alignment on each, and store the results for visualization."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# Paths to the input images\n",
"image_paths = [\n",
" \"../assets/test_images/image0.jpg\",\n",
" \"../assets/test_images/image1.jpg\",\n",
" \"../assets/test_images/image2.jpg\",\n",
" \"../assets/test_images/image3.jpg\",\n",
" \"../assets/test_images/image4.jpg\",\n",
"]\n",
"\n",
"# Lists to store detection results and aligned images\n",
"detection_images = []\n",
"aligned_images = []\n",
"original_images = []\n",
"\n",
"# Process each image\n",
"for image_path in image_paths:\n",
" # Load the image\n",
" input_image = cv2.imread(image_path)\n",
" if input_image is None:\n",
" print(f\"Error: Could not read image from {image_path}\")\n",
" continue\n",
" \n",
" # Perform face detection\n",
" boxes, landmarks = uniface_inference.detect(input_image)\n",
"\n",
" if len(landmarks) == 0:\n",
" print(f\"No face detected in {image_path}\")\n",
" continue\n",
" \n",
" # Draw detections on the image for visualization\n",
" bbox_image = input_image.copy()\n",
" draw_detections(bbox_image, (boxes, landmarks), vis_threshold=0.6)\n",
"\n",
" # Align the first detected face\n",
" landmark_array = landmarks[0]\n",
" aligned_image, _ = face_alignment(input_image, landmark_array, image_size=112)\n",
" \n",
" # Convert images to RGB format for proper visualization\n",
" bbox_image = cv2.cvtColor(bbox_image, cv2.COLOR_BGR2RGB) \n",
" aligned_image = cv2.cvtColor(aligned_image, cv2.COLOR_BGR2RGB)\n",
" \n",
" # Store the processed images for visualization\n",
"    original_images.append(cv2.cvtColor(input_image, cv2.COLOR_BGR2RGB))  # Convert BGR to RGB for display\n",
" detection_images.append(bbox_image)\n",
" aligned_images.append(aligned_image)"
]
},
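{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick, optional sanity check before plotting, the sketch below confirms how many images were processed and the size of the aligned crops. The expected 112x112 crop size follows from the `image_size=112` argument used above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional: verify how many images were processed and the aligned crop size.\n",
"print(f\"Processed {len(aligned_images)} of {len(image_paths)} images\")\n",
"if aligned_images:\n",
"    # Each aligned crop is expected to be image_size x image_size, i.e. (112, 112, 3)\n",
"    print(\"Aligned face shape:\", aligned_images[0].shape)"
]
},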
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 5. Display Inference Results\n",
"Visualize the original images, detection results, and aligned faces in a three-row grid."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 1500x1000 with 15 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"# Plot images in a 3-row grid: originals, detections, and aligned faces\n",
"fig, axes = plt.subplots(3, len(image_paths), figsize=(15, 10))\n",
"\n",
"# Titles for each row\n",
"row_titles = [\"Input Images\", \"Detection Results\", \"Face Alignment\"]\n",
"\n",
"# Populate the grid with images\n",
"for row, images in enumerate([original_images, detection_images, aligned_images]):\n",
" for col, img in enumerate(images):\n",
" # Display each image in the grid\n",
" axes[row, col].imshow(img)\n",
" axes[row, col].axis(\"off\") # Remove axes for cleaner visuals\n",
" \n",
" # Set row title on the first column of each row\n",
" if col == 0:\n",
" axes[row, col].set_title(row_titles[row], fontsize=12, loc=\"left\")\n",
" \n",
"\n",
"# Adjust layout to prevent overlap and display the plot\n",
"plt.tight_layout()\n",
"plt.show()\n"
]
},
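{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, save the aligned face crops to disk for later use. This is a minimal sketch; the `aligned_faces` directory and file names are arbitrary examples. Because the aligned images were converted to RGB for plotting above, they are converted back to BGR before writing with OpenCV."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Example output directory; any writable path works.\n",
"output_dir = \"aligned_faces\"\n",
"os.makedirs(output_dir, exist_ok=True)\n",
"\n",
"for idx, aligned in enumerate(aligned_images):\n",
"    # aligned_images hold RGB data (converted for matplotlib), so convert back to BGR for cv2.imwrite\n",
"    out_path = os.path.join(output_dir, f\"aligned_{idx}.jpg\")\n",
"    cv2.imwrite(out_path, cv2.cvtColor(aligned, cv2.COLOR_RGB2BGR))"
]
},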
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "base",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}