analyse image from ()

Description

To execute face detection, we use the analyse image from () block.

analyse image from ()

You can input the image in the following ways:

  1. Camera feed
  2. Stage

This block analyses the image and saves the face information locally, which other blocks can then access, in the same way the Computer Vision blocks work.

You have to run this block every time you want to analyse a new image from the camera or stage.
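The analyse-then-query pattern above can be sketched in Python. This is a minimal illustration, not the PictoBlox API: the class and the stub detector are hypothetical stand-ins showing why the block must be re-run for every new frame before the other blocks are queried.

```python
# Sketch of the "analyse then query" pattern (hypothetical names; PictoBlox
# stores the face information internally for you).

class FaceAnalyser:
    """Caches the result of the last analysis, like the block does."""

    def __init__(self, detector):
        self.detector = detector   # any function: image -> list of face boxes
        self.last_faces = []

    def analyse_image(self, image):
        # Must be called again for every new frame, or the queries go stale.
        self.last_faces = self.detector(image)

    def face_count(self):
        return len(self.last_faces)

    def face_position(self, index):
        x, y, w, h = self.last_faces[index]
        return (x + w // 2, y + h // 2)   # centre of the bounding box

# Stub detector standing in for the real face-detection model:
def stub_detector(image):
    return [(10, 20, 30, 40)] if image == "frame_with_face" else []

analyser = FaceAnalyser(stub_detector)
analyser.analyse_image("frame_with_face")
print(analyser.face_count())       # 1
print(analyser.face_position(0))   # (25, 40)
```

Querying `face_count()` without calling `analyse_image()` again simply returns the previous frame's result, which is exactly the behaviour the note above warns about.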

Example

This example shows how to create a face filter with Face Detection, including how to make the filter tilt with the angle of the face.

Script

Example

Learn how to code logic for video input detection with this example block code. You will be able to direct your own Mars Rover easily by just showing signs through the camera input.

Introduction

A sign detector Mars Rover robot is a robot that can recognize and interpret certain signs or signals, such as hand gestures or verbal commands, given by a human. The robot uses sensors, cameras, and machine learning algorithms to detect and understand the sign, and then performs a corresponding action based on the signal detected.

These robots are often used in manufacturing, healthcare, and customer service industries to assist with tasks that require human-like interaction and decision making.

Code

Initializing the Functions:

Main Code

Logic

  1. Firstly, the code sets up the stage camera to look for signs, then detects and recognizes the signs shown to the camera.
  2. Next, the code starts a loop where the stage camera continuously checks for the signs.
  3. Finally, if the robot sees certain signs (like ‘Go’, ‘Turn Left’, ‘Turn Right’, or ‘U Turn’), it moves in the corresponding direction (forward, left, right, or turning around).
  4. This can help the Mars Rover to manoeuvre through the terrain easily by just showing signs on the camera.
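The loop logic in the steps above can be sketched as a simple sign-to-motion lookup. The sign names come from the steps above; `rover_step()` is a hypothetical helper standing in for the camera and movement blocks, not part of any real rover API.

```python
# Hedged sketch of one iteration of the sign-detection loop.
# Sign names follow the article; motions are illustrative labels only.

SIGN_TO_MOTION = {
    "Go": "forward",
    "Turn Left": "left",
    "Turn Right": "right",
    "U Turn": "turn around",
}

def rover_step(sign):
    """Return the motion for one loop iteration, or None for unknown signs."""
    return SIGN_TO_MOTION.get(sign)

# In the real script this runs inside a forever loop fed by the stage camera:
for sign in ["Go", "Turn Left", "U Turn"]:
    print(sign, "->", rover_step(sign))
```

Unknown or absent signs map to `None`, i.e. the rover simply keeps checking the camera on the next pass of the loop.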

Output

Forward-Backward Motions:

Right-Left Motions:

Learn about face-tracking, and how to code a face-tracking Quadruped robot using sensors and computer vision techniques.

Activity Description

In this activity, students will program Quarky to detect a face’s position using the camera and respond with movements. Based on which direction the face is (left, right, or center), Quarky will display a pattern and move accordingly. This teaches camera-based input, angle calculations, and conditional movements.

Let’s Learn

  1. Open the PictoBlox application from the Start Menu.
  2. Select the inviting realm of Blocks as your coding environment.
  3. Connect “Quarky” to your computer using a USB cable. Then, click the Board button in the toolbar and Select Board as Quarky.
  4. Next, select the appropriate Serial port if the Quarky is connected via USB or the Bluetooth Port if you want to connect Quarky via Bluetooth and press Connect.
  5. Click on the Add Extension button and add the Quarky Quadruped extension.
  6. Add when flag clicked block from the Event Palette. This block helps you to start the script.
  7. To set up the quadruped, drag and drop the pins for each leg and hip into the initialisation block using the set pins FR Hip () FL Hip () FR Leg () FL Leg () BR Hip () BL Hip () BR Leg () BL Leg () block. This block sets which pins on the Quarky controller board control each servo motor for the front right (FR), front left (FL), back right (BR), and back left (BL) hips and legs. Drag this block and set each pin as shown: FR Hip: 4, FL Hip: 1, FR Leg: 8, FL Leg: 5, BR Hip: 3, BL Hip: 2, BR Leg: 7, BL Leg: 6.
  8. Turn on the camera video on the stage with 0% transparency so it remains visible.
  9. Begin a forever loop to keep checking the face’s position continuously.
  10. Use the analyse image from camera block to start facial recognition.
  11. Declare the variable ‘Angle’. Build the expression 90 + (get () of face () / 3): place the get () of face () block in the first slot of the division, with 3 in the second slot, and select X position from the block’s dropdown.
  12. Set the variable Angle by calculating 90 + (x position of face ÷ 3) to decide how far the face is from the center.
  13. Use if-else blocks to respond based on the face’s horizontal position: if Angle > 90, the face is on the right side; show a face on the LED matrix and move left using the “lateral left” motion.
  14. Else, if Angle < 90, the face is on the left side; show a face and move right using the “lateral right” motion.
  15. Else (Angle = 90), the face is centered; show a smiley face and move to the home (neutral) position.
Note: Check by changing the angle value, and also try changing the icons on the display matrix, such as L for the left and R for the right direction.
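The angle calculation and the three branches above can be sketched as follows. This is an illustrative Python version of the block logic, not PictoBlox code; the function names are made up for the example, and the motion strings echo the block names described in the steps.

```python
# Sketch of steps 11-15: compute Angle from the face's x position,
# then pick the response; 90 means the face is centred.

def angle_from_face_x(x_position):
    # Step 12: Angle = 90 + (x position of the face / 3)
    return 90 + x_position / 3

def respond(x_position):
    angle = angle_from_face_x(x_position)
    if angle > 90:        # face on the right -> move laterally left
        return "lateral left"
    elif angle < 90:      # face on the left -> move laterally right
        return "lateral right"
    else:                 # face centred -> home (neutral) position
        return "home"

print(respond(60))    # lateral left
print(respond(-60))   # lateral right
print(respond(0))     # home
```

Dividing the x position by 3 scales the stage coordinates into a narrower band around 90, so small face movements near the centre do not trigger a motion on every frame.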

Output

Our next step is to check whether it is working correctly. Whenever your face comes in front of the camera, the robot should detect it, and as you move to the right or left, your Quadruped robot should move accordingly.
