PixelLib is a library created to enable easy implementation of object segmentation in real-life applications. PixelLib supports image tuning, which is the ability to alter the background of any image. PixelLib now also supports video tuning, which is the ability to alter the background of videos and camera feeds. PixelLib employs the technique of object segmentation to perform excellent foreground and background subtraction. PixelLib makes use of a deeplabv3+ model trained on the pascalvoc dataset, which supports 20 object categories:
person, bus, car, aeroplane, bicycle, motorbike, bird, boat, bottle, cat, chair, cow, diningtable, dog, horse, pottedplant, sheep, sofa, train, tv
Background effects supported are as follows:
1 Changing the background of an image with a picture
2 Assigning a distinct color to the background of an image and a video.
3 Blurring the background of an image and a video.
4 Grayscaling the background of an image and a video.
5 Creating a virtual background for a video.
Install PixelLib and its dependencies:
Install TensorFlow (PixelLib supports TensorFlow 2.0 and above) with:
pip3 install tensorflow
Install PixelLib with:
pip3 install pixellib
If PixelLib is already installed, upgrade to the latest version using:
pip3 install pixellib --upgrade
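To quickly verify the installation, you can run a short check like the sketch below. It only confirms that both packages import and that the installed TensorFlow version meets the 2.0 requirement:
import tensorflow as tf
import pixellib  # raises ImportError if PixelLib is not installed
print(tf.__version__)  # PixelLib requires TensorFlow 2.0 and above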
In some applications, you may want to target the detection of a particular
object in an image or a video. The deeplab model by default detects all
the objects it supports in an image or video. It is now possible to
filter out unused detections and target a particular object in an image
or a video.
We intend to blur the background of the image above.
Code to blur an image's background
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.blur_bg("sample.jpg", extreme = True, output_image_name="output_img.jpg")
Our goal is to completely blur the background of the person in this image, but we are not satisfied with the presence of other objects. Therefore, we need to modify the code to detect only a target object.
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.blur_bg("sample.jpg", extreme = True, output_image_name="output_img.jpg", detect = "person")
It is still the same code, except we introduced an extra parameter, detect, in the blur_bg function.
change_bg.blur_bg("sample.jpg", extreme = True, output_image_name="output_img.jpg", detect = "person")
detect: This is the parameter that determines the target object to be detected. The value of detect is set to person. This means that the model will detect only the person class in the image.
This is the new image with only our target object shown.
If we intend to show only the cars present in this image, we just have to change the value of detect from person to car.
change_bg.blur_bg("sample.jpg", extreme = True, output_image_name="output_img.jpg", detect = "car")
Color the background of a target object
Target detection can also be done with the color effect.
change_bg.color_bg("sample.jpg", colors = (0,128,0), output_image_name="output_img.jpg", detect = "person")
Change the background of a target object with a new picture
background image
change_bg.change_bg_img("sample.jpg", "background.jpg", output_image_name="output_img.jpg", detect = "person")
Grayscale the background of a target object
change_bg.gray_bg("sample.jpg", output_image_name="output_img.jpg", detect = "person")
Read this article for a comprehensive understanding of background editing in images with PixelLib.
Video tuning is the ability to alter the background of any video.
Blur Video background
PixelLib makes it convenient to blur the background of any video using just five lines of code.
sample_video
code to blur the background of a video file
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.blur_video("sample_video.mp4", extreme = True, frames_per_second=10, output_video_name="blur_video.mp4")
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
We imported pixellib, and from pixellib we imported the class alter_bg. We created an instance of the class with the parameter model_type set to pb. Finally, we called the function to load the model.
Note:
PixelLib supports two types of deeplabv3+ models, keras and tensorflow
models. The keras model is extracted from the tensorflow model’s
checkpoint. The tensorflow model performs better than the keras model
extracted from its checkpoint. We will make use of the tensorflow model.
Download the tensorflow model from here.
There are three parameters that determine the degree to which the background is blurred.
low: When it is set to true, the background is blurred slightly.
moderate: When it is set to true, the background is moderately blurred.
extreme: When it is set to true, the background is deeply blurred.
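For instance, a moderately blurred version of the same video could be produced with a sketch like the one below. It repeats the setup from above and simply swaps extreme for moderate; the output file name is a placeholder:
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
# Same call as before, but with moderate blurring instead of extreme blurring
change_bg.blur_video("sample_video.mp4", moderate = True, frames_per_second=10, output_video_name="moderate_blur_video.mp4")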
change_bg.blur_video("sample_video.mp4", extreme = True, frames_per_second=10, output_video_name="blur_video.mp4", detect = "person")
This is the line of code that blurs the video’s background. This function takes in five parameters:
video_path: This is the path to the video file whose background we want to blur.
extreme: It is set to true, and the background of the video will be extremely blurred.
frames_per_second: This is the parameter that sets the number of frames per second for the output video file. In this case, it is set to 10, i.e. the saved video file will have 10 frames per second.
output_video_name: This is the name of the saved output video. It will be saved in your current working directory.
detect: This is the parameter that chooses the target object in the video. It is set to person.
output video
Blur the Background of Camera Feeds
import pixellib
from pixellib.tune_bg import alter_bg
import cv2
capture = cv2.VideoCapture(0)
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.blur_camera(capture, frames_per_second=10,extreme = True, show_frames = True, frame_name = "frame", check_fps = True,
output_video_name="output_video.mp4", detect = "person")
import cv2
capture = cv2.VideoCapture(0)
We imported cv2 and included the code to capture the camera's frames.
change_bg.blur_camera(capture, extreme = True, frames_per_second=10, output_video_name="output_video.mp4", show_frames = True, frame_name = "frame", check_fps = True, detect = "person")
In the code for blurring the camera's frames, we replaced the video's file path with capture, i.e. we are going to process a stream of camera frames instead of a video file. We added extra parameters for the purpose of showing the camera's frames:
show_frames: This is the parameter that handles the display of the blurred camera frames.
frame_name: This is the name given to the displayed frame.
check_fps: If you want to check the number of frames processed per second, just set the parameter check_fps to true. It will print out the number of frames per second. In this case, it is 30 frames per second.
Output Video
Wow! PixelLib successfully blurred my background in the video.
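If you run the camera examples yourself, it is good practice to release the webcam and close any display windows once processing is finished. This is plain OpenCV housekeeping rather than part of PixelLib's API, and PixelLib's own frame display may already handle window cleanup:
capture.release()  # free the webcam
cv2.destroyAllWindows()  # close any OpenCV display windows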
Create a Virtual Background for a Video
PixelLib makes it super easy to create a virtual background for any video, and you can make use of any image as the virtual background.
sample video
Image to serve as background for a video
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type="pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.change_video_bg("sample_video.mp4", "space.jpg", frames_per_second = 10, output_video_name="output_video.mp4", detect = "person")
change_bg.change_video_bg("sample_video.mp4", "space.jpg", frames_per_second = 10, output_video_name="output_video.mp4", detect = "person")
It is still the same code, except we called the function change_video_bg to create a virtual background for the video. The function takes in the path of the image we want to use as the video's background.
Output Video
Beautiful demo! We were able to successfully create a virtual space background for the video.
Create a Virtual Background for Camera Feeds
import pixellib
from pixellib.tune_bg import alter_bg
import cv2
cap = cv2.VideoCapture(0)
change_bg = alter_bg(model_type="pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.change_camera_bg(cap, "space.jpg", frames_per_second = 5, show_frames=True, frame_name="frame", output_video_name="output_video.mp4", detect = "person")
change_bg.change_camera_bg(cap, "space.jpg", frames_per_second = 5, show_frames=True, frame_name="frame", output_video_name="output_video.mp4", detect = "person")
It is similar to the code we used to blur the camera's frames. The only difference is that we called the function change_camera_bg. We performed the same routine, replaced the video's file path with capture, and added the same parameters.
Output Video
Wow! PixelLib successfully created a virtual background for my video.
Color Video background
PixelLib makes it possible to assign any color to the background of a video.
code to color the background of a video file
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.color_video("sample_video.mp4", colors = (0,128,0), frames_per_second=10, output_video_name="output_video.mp4",
detect = "person")
change_bg.color_video("sample_video.mp4", colors = (0, 128, 0), frames_per_second=10, output_video_name="output_video.mp4", detect = "person")
It is still the same code, except we called the function color_video to give the video's background a distinct color. The function color_video takes the parameter colors, whose value is set to green. The RGB value of the color green is (0, 128, 0).
output video
change_bg.color_video("sample_video.mp4", colors = (255, 255, 255), frames_per_second=10, output_video_name="output_video.mp4", detect = "person")
The same video with a white background
Color the Background of Camera Feeds
import pixellib
from pixellib.tune_bg import alter_bg
import cv2
capture = cv2.VideoCapture(0)
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.color_camera(capture, frames_per_second=15,colors = (0, 128, 0), show_frames = True, frame_name = "frame", check_fps = True,
output_video_name="output_video.mp4", detect = "person")
change_bg.color_camera(capture, frames_per_second=15, colors = (0, 128, 0), show_frames = True, frame_name = "frame", check_fps = True, output_video_name="output_video.mp4", detect = "person")
It is similar to the code we used to create a virtual background for the camera's frames. The only difference is that we called the function color_camera. We performed the same routine, replaced the video's file path with capture, and added the same parameters.
Output Video
Grayscale Video background
code to grayscale the background of a video file
import pixellib
from pixellib.tune_bg import alter_bg
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.gray_video("sample_video.mp4", frames_per_second=10, output_video_name="output_video.mp4", detect = "person")
output video
Note: The background of the video will be altered, while the objects present maintain their original quality.
Grayscale the Background of Camera Feeds
import pixellib
from pixellib.tune_bg import alter_bg
import cv2
capture = cv2.VideoCapture(0)
change_bg = alter_bg(model_type = "pb")
change_bg.load_pascalvoc_model("xception_pascalvoc.pb")
change_bg.gray_camera(capture, frames_per_second=10, show_frames = True, frame_name = "Ayo", check_fps = True,
output_video_name="output_video.mp4", detect = "person")
It is similar to the code we used to color the camera's frames. The only difference is that we called the function gray_camera. We performed the same routine, replaced the video's file path with capture, and added the same parameters.
Reach me via:
Email: [email protected]
Linkedin: Ayoola Olafenwa
Twitter: @AyoolaOlafenwa
Facebook: Ayoola Olafenwa