set head pin () FLeft () FRight () BLeft () BRight ()

Description

The block sets the servo connection for each specified location to the specified pin.

The servos are connected to the expansion board servo ports.

By default the following configuration is added:

  1. Head Servo – 4
  2. Front Left Servo – 1
  3. Front Right Servo – 5
  4. Back Left Servo – 2
  5. Back Right Servo – 6
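
For reference, the default configuration above can also be written out as a simple mapping. The following minimal Python sketch just prints that table; the dictionary name and structure are illustrative only, not part of the PictoBlox API.

    # Default servo-to-pin mapping of the Mars Rover expansion board,
    # as listed above. Names and structure are illustrative only.
    DEFAULT_SERVO_PINS = {
        "Head": 4,
        "Front Left": 1,
        "Front Right": 5,
        "Back Left": 2,
        "Back Right": 6,
    }

    for location, pin in DEFAULT_SERVO_PINS.items():
        print(f"{location} servo -> pin {pin}")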

Example

Learn how to code the Mars Rover to turn left and right in a circle with the set () to () block. Try different left and right orientations and move the Mars Rover with the up and down keys.

Introduction

In the last project, we looked at the Mars Rover control for turning left and right.

Instead of rotating the Mars Rover in place to turn left or right, you can alternatively make the Mars Rover move in a circle.

  1. Turning left in a circle:
  2. Turning right in a circle:

This can be done with the set () to () block, which gives you two orientation options: Left and Right.

Left Orientation

Right Orientation

Coding Steps

The following code sets the servo motor position to the left, straight, and right when the a, s, and d keys are pressed. To move the Mars Rover, the code then checks whether the up or down key is pressed.
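
As a rough outline, the control flow looks like the Python sketch below. The helpers key_pressed(), set_orientation(), and move() are hypothetical stand-ins for the corresponding blocks, not real PictoBlox functions.

    # Hypothetical stand-ins for the blocks used in this project:
    def key_pressed(key: str) -> bool: ...
    def set_orientation(direction: str) -> None: ...  # set () to () block
    def move(direction: str) -> None: ...

    def control_loop() -> None:
        while True:
            # a / s / d choose the servo orientation
            if key_pressed("a"):
                set_orientation("left")       # Left option
            elif key_pressed("s"):
                set_orientation("straight")   # servos straight ahead
            elif key_pressed("d"):
                set_orientation("right")      # Right option
            # up / down drive the rover along the chosen orientation
            if key_pressed("up arrow"):
                move("forward")
            elif key_pressed("down arrow"):
                move("backward")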

Build the code and play with the Mars Rover.

Output

Circular Right-Left Motion

Read More
Learn how to code logic for video input detection with this example block code. You will be able to direct your own Mars Rover easily by just showing signs through the camera input.

Introduction

A sign detector Mars Rover robot is a robot that can recognize and interpret certain signs or signals, such as hand gestures or verbal commands, given by a human. The robot uses sensors, cameras, and machine learning algorithms to detect and understand the sign, and then performs a corresponding action based on the signal detected.

These robots are often used in manufacturing, healthcare, and customer service industries to assist with tasks that require human-like interaction and decision making.

Code

Initializing the Functions:

Main Code

Logic

  1. Firstly, the code sets up the stage camera to look for signs, then detects and recognizes the signs shown to the camera.
  2. Next, the code starts a loop in which the stage camera continuously checks for signs.
  3. Finally, if the robot sees certain signs (like ‘Go’, ‘Turn Left’, ‘Turn Right’, or ‘U Turn’), it moves in the corresponding direction (forward, left, right, or backward), as sketched below.
  4. This helps the Mars Rover manoeuvre through the terrain easily, guided just by signs shown to the camera.
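
The dispatch described above can be sketched in Python as follows. detect_sign() and move() are hypothetical stand-ins for the recognition and Mars Rover blocks.

    # Hypothetical stand-ins for the recognition and motion blocks:
    def detect_sign() -> str: ...
    def move(direction: str) -> None: ...

    # Sign-to-direction mapping taken from the logic above.
    ACTIONS = {
        "Go": "forward",
        "Turn Left": "left",
        "Turn Right": "right",
        "U Turn": "backward",
    }

    def run() -> None:
        while True:                  # the camera checks continuously
            sign = detect_sign()     # analyse the current camera frame
            if sign in ACTIONS:
                move(ACTIONS[sign])  # drive in the matching direction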

Output

Forward-Backward Motions:

Right-Left Motions:

Read More
Learn to control the Mars Rover using the Dabble app on your device, with customized functions for specialized circular motions.

Introduction

In this activity, we will control the Mars Rover according to our needs using the Dabble application on our own devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the home screen; we will then use this gamepad to control our Mars Rover.

Code

The following blocks represent the different functions that are created to control the Mars Rover for different types of motions. We will use the arrow buttons to control the basic movements (Forward, Backward, Left, Right).

We will create custom functions for the specialized circular motions of the Mars Rover and use the Cross, Square, Circle, and Triangle buttons to trigger them.

Note: You will have to add both the Mars Rover and Dabble extensions to access the blocks.

The main code will be quite simple, consisting of nested if-else statements that determine the action when a specific button is pressed in the Dabble application, as in the sketch below.
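
A minimal Python sketch of that dispatch is given below. button_pressed(), move(), and circular_motion() are hypothetical stand-ins for the Dabble and Mars Rover blocks, and the exact button-to-motion mapping is illustrative.

    # Hypothetical stand-ins for the Dabble and Mars Rover blocks:
    def button_pressed(name: str) -> bool: ...
    def move(direction: str) -> None: ...
    def circular_motion(kind: str) -> None: ...  # custom functions

    def on_gamepad_input() -> None:
        # Arrow buttons: basic movements
        if button_pressed("up"):
            move("forward")
        elif button_pressed("down"):
            move("backward")
        elif button_pressed("left"):
            move("left")
        elif button_pressed("right"):
            move("right")
        # Shape buttons: custom circular motions (mapping illustrative)
        elif button_pressed("cross"):
            circular_motion("circular left forward")
        elif button_pressed("square"):
            circular_motion("circular left backward")
        elif button_pressed("circle"):
            circular_motion("circular right forward")
        elif button_pressed("triangle"):
            circular_motion("circular right backward")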

You will have to connect the Quarky with the Dabble application on your device. Make sure Bluetooth is enabled on the device before connecting. After uploading the code, connect the Rover to the Dabble application by selecting the plug option in the app, as seen below; your Quarky device will appear in the list, and you can connect by clicking on it.

Important Notes

  1. The code will run only after it is uploaded to the Rover, which must be connected to the laptop with a C-Type cable.
  2. You will be able to upload the Python code by selecting the Upload option beside the Stage option.
  3. In some cases you may have to upload the firmware first and then upload the code to the Rover. You can upload the firmware to Quarky with the following steps:
    1. Select the Quarky palette from the Block section.
    2. Select the Settings button at the top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This resets the Quarky, clearing any previously uploaded code.
  4. After the firmware is uploaded, click on the “Upload Code” option to upload the code.
  5. You will have to use the “When Quarky Starts Up” block rather than the conventional “When Green Flag Clicked” block for the uploaded code to run.

Output

Forward-Backward Motions:

Right-Left Motions:

Circular Left Motion:

Circular Right Motion:

Read More
Explore the surroundings with our obstacle avoidance Mars Rover that uses an ultrasonic sensor to detect and avoid obstacles. Learn how the robot moves, detects obstacles, and navigates its way through them.

This obstacle avoidance project is for a robot that moves around and looks for obstacles. It uses an ultrasonic sensor to measure the distance ahead. If the distance is less than 20 cm, the robot stops and looks in both directions to check whether it can move forward. If it can, it turns left or right; if not, it makes a U-turn.

Logic

  1. This code makes the robot move around and explore its surroundings. The robot has an ultrasonic sensor that can measure the distance to objects.
  2. We will first initialize the servos of the Mars Rover with the block “Set head pins()”.
  3. Then we will rotate all the servos to 90 degrees if they are not initialized.
  4. Thereafter we will initialize the ultrasonic sensor and define the minimum and maximum distance variables.
  5. The main logic first checks whether the measured distance is less than the minimum distance. If it is, the head servo moves to 45 degrees and checks whether the distance there is greater than the maximum distance, in which case the rover moves in that direction.
  6. With the help of the head servo, the robot checks the distance at 90 degrees, 45 degrees, 135 degrees, 0 degrees, and 180 degrees, in that order.
  7. Whenever the measured distance is less than the minimum distance, the head servo moves on to the next angle and checks the distance again.
  8. In the worst case, where obstacles are detected at every angle, the robot reverses its direction by rotating 180 degrees. In this way the robot can navigate its way around every obstacle (see the sketch below).
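
The Python sketch below summarizes this flow. All helpers are hypothetical stand-ins for the ultrasonic-sensor and Mars Rover blocks; only the 20 cm minimum comes from this project, and the maximum-distance value is an assumed placeholder.

    # Hypothetical stand-ins for the sensor and motion blocks:
    def set_head_angle(angle: int) -> None: ...
    def read_distance() -> float: ...
    def move(direction: str) -> None: ...
    def turn_towards(angle: int) -> None: ...

    MIN_DISTANCE = 20                    # cm, from this project
    MAX_DISTANCE = 40                    # cm, assumed placeholder
    SCAN_ANGLES = [90, 45, 135, 0, 180]  # checked in this order

    def distance_at(angle: int) -> float:
        set_head_angle(angle)            # point the head servo
        return read_distance()           # ultrasonic reading there

    def step() -> None:
        if distance_at(90) >= MIN_DISTANCE:
            move("forward")              # path ahead is clear
            return
        for angle in SCAN_ANGLES[1:]:    # scan the remaining angles
            if distance_at(angle) > MAX_DISTANCE:
                turn_towards(angle)      # head toward the clear angle
                move("forward")
                return
        turn_towards(180)                # everything blocked: U-turn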

Code:

Main Functions:


Final Main Logic:

Output


Read More
Discover the exciting world of face-tracking robots and learn how to code one using sensors and algorithms.

Introduction

A face-tracking robot is a type of robot that uses sensors and algorithms to detect and track human faces in real time. The robot’s sensors, such as cameras or infrared sensors, capture images or videos of the surrounding environment and use computer vision techniques to analyze the data and identify human faces.

Face-tracking robots have many potential applications, including in security systems, entertainment, and personal robotics. For example, a face-tracking robot could be used in a museum or amusement park to interact with visitors, or in a home as a companion robot that can recognize and follow the faces of family members.

One of the most fascinating activities is face tracking, in which the Humanoid can detect a face and move its head in the same direction as yours. Intriguing as it sounds, let’s get started with coding a face-tracking Humanoid robot.

Logic

  1. If the face is tracked at the center of the stage, the Humanoid should be straight.
  2. As the face moves to the left side, the Humanoid will also move to the left side.
  3. As the face moves to the right side, the Humanoid will also move to the right side.

Code Explained

  1. Drag and drop the when green flag clicked block from the Events palette.
  2. Then, add a turn () video on stage with () % transparency block from the Face Detection extension and select one from the drop-down. This will turn on the camera.
  3. Add the set head pin () FLeft () FRight () BLeft () BRight () block from the Humanoid extension.
  4. Click on the green flag and your camera should start. Make sure this part is working before moving further.
  5. Add the forever block below turn () video on stage with () % transparency from the Control palette.
  6. Inside the forever block, add the analyse image from () block. This block will analyze the face the camera detects. Select the camera from the dropdown.
  7. Create a variable called Angle that will track the angle of the face. Based on the angle, the robot will move to adjust its position.
  8. Here comes the logical part: the position of the face on the stage matters a lot. Keeping that in mind, we will add the division () / () block from the Operator palette into the scripting area.
  9. Place the get () of the face () block in the first slot of the division block and 3 in the second slot; from the dropdown, select X position. Then use the addition () + () block to add 90 to the result and set Angle to this value.
  10. If the angle value is greater than 90, the Humanoid will move left at a specific speed. If the angle is less than 90, the Humanoid will move right at a specific speed. If the angle is exactly 90, the Humanoid will return to its home position.

Block Explained

  1. Create a variable called Angle and assign it the value computed from the face’s position.
  2. At the center of the stage, the X position value is zero.
  3. As the face moves to the left, the X position value becomes negative, and as it moves to the right, the X position value becomes positive.
  4. The X position value is divided by 3, which gives more precise positioning.
  5. To set the angle to 90 when the face is at the center of the stage, we add 90 to the scaled X position value (see the worked example below).
  6. As the face moves to the left, the angle value decreases because the X position value goes negative.
  7. As the face moves to the right, the angle value increases because the X position value goes positive.
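
Here is a short worked example of that computation in Python, using the scaling and offset described above.

    # Angle computation from the X position of the face on the stage
    # (0 at the centre, negative to the left, positive to the right).
    def face_angle(x: float) -> float:
        # Divide by 3 to scale the stage coordinate, then add 90 so
        # the angle is exactly 90 when the face is centred.
        return x / 3 + 90

    print(face_angle(0))    # 90.0  -> home position, head straight
    print(face_angle(60))   # 110.0 -> greater than 90: move left
    print(face_angle(-60))  # 70.0  -> less than 90: move right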

Code

Output

Our next step is to check whether it is working correctly. Whenever your face comes in front of the camera, the robot should detect it, and as you move to the right or left, the head of your Humanoid robot should move accordingly.

Read More
Learn how to code logic for speech-recognition control of the Mars Rover with this example block code. You will be able to direct your own Mars Rover easily just by speaking commands.

Introduction

A speech-recognition-controlled Mars Rover is a robot that can recognize and interpret verbal commands given by a human. The code uses a speech recognition model that records and analyzes your speech and makes the Mars Rover react accordingly.

Speech recognition robots can be used in manufacturing and other industrial settings to control machinery, perform quality control checks, and monitor equipment.

They are also used to help patients with disabilities to communicate with their caregivers, or to provide medication reminders and other health-related information.

Main Code:

Logic

  1. Firstly, the code initializes the Mars Rover pins and starts recording from the device’s microphone to capture the user’s audio command.
  2. The code then checks whether the command includes the word “Go”. You can use customized commands and test for different conditions on your own.
  3. If the first condition is false, the code checks the command for the other keywords.
  4. When a condition is true, the robot aligns itself accordingly and moves in the direction of the respective command (see the sketch below).
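
A minimal Python sketch of this keyword check is given below. listen() and move() are hypothetical stand-ins for the speech-recognition and Mars Rover blocks; only the “Go” keyword comes from this example, the other keywords are placeholders.

    # Hypothetical stand-ins for the recognition and motion blocks:
    def listen() -> str: ...
    def move(direction: str) -> None: ...

    def on_command() -> None:
        command = listen()       # record and analyse the speech
        if "Go" in command:      # keyword from this example
            move("forward")
        elif "Back" in command:  # placeholder keywords below
            move("backward")
        elif "Left" in command:
            move("left")
        elif "Right" in command:
            move("right")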

Output

Forward-Backward Motions:

Right-Left Motions:

Read More
Learn how to create custom sounds to control Mars Rover with the Audio Classifier of the Machine Learning Environment in PictoBlox. Start building your Sound Based Controlled Mars Rover now!

In this activity, we will use the Machine Learning Environment of the PictoBlox software. We will use the Audio Classifier of the Machine Learning Environment and create custom sounds to control the Mars Rover.

Audio Classifier Workflow

Follow the steps below to create your own Audio Classifier Model:

  1. Open PictoBlox and create a new file.
  2. Select the Block Coding environment as the appropriate coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project“.
  5. A new window will open. Type in an appropriate project name of your choice and select the “Audio Classifier” extension. Click the “Create Project” button to open the Audio Classifier Window.
  6. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.
  7. As you can observe in the above image, we will add two classes for audio and add audio samples to each with the help of the microphone. Rename Class 1 as “Clap” and Class 2 as “Snap”.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the microphone.
  3. You will be able to add audio samples to each class; make sure you add at least 20 samples per class for the model to run with good accuracy.
  4. Add the first class as “clap” and record the audio for clap noises through the microphone.
  5. Add the second class as “snap” and record the audio for snap noises through the microphone.

Note: You can only change a class name at the start, before adding any audio samples. Once audio samples have been added to a class, its name can no longer be changed.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the audio samples, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply use the microphone directly and check the classes as shown in the image below:

You will be able to test the difference in audio samples recorded from the microphone as shown below:

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you opened the ML Environment from it.


Logic

The Mars Rover will move according to the following logic:

  1. When the audio is identified as “clap”, the Mars Rover will move forward.
  2. When the “snap” sound is detected, the Mars Rover will move backward.

Note: You can add even more classes with different distinguishing sounds to customize your control. This is just a small example from which you can build your own Sound Based Controlled Mars Rover in an easy, stepwise way.


Code


Logic

  1. First, we will initialize the different audio classes.
  2. Then, we will open the recognition window, which turns on the microphone, records the audio, and identifies its class.
  3. If the identified class from the analyzed audio is “clap,” the Mars Rover will move forward at a specific speed.
  4. If the identified class is “snap,” the Mars Rover will move backward (see the sketch below).
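
A minimal Python sketch of this loop is given below. recognise_audio() and move() are hypothetical stand-ins for the exported model’s recognition block and the Mars Rover motion blocks; the speed value is a placeholder.

    # Hypothetical stand-ins for the model and motion blocks:
    def recognise_audio() -> str: ...
    def move(direction: str, speed: int) -> None: ...

    def run() -> None:
        while True:
            detected = recognise_audio()  # classify microphone input
            if detected == "clap":
                move("forward", 40)       # speed is a placeholder
            elif detected == "snap":
                move("backward", 40)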

Output

Read More
This project demonstrates how to use the Machine Learning Environment to make a machine-learning model that identifies hand gestures and makes the Mars Rover move accordingly.

We are going to use the Hand Classifier of the Machine Learning Environment. The model works by analyzing your hand position with the help of 21 data points.

Hand Gesture Classifier Workflow

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select the appropriate coding environment (Block Coding).
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  6. You shall see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: It’s the name by which the class will be referred to.
  2. Hand Pose Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:
Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you opened the ML Environment from it.

Logic

The Mars Rover will move according to the following logic:

  1. When the forward gesture is detected, the Mars Rover will move forward.
  2. When the backward gesture is detected, the Mars Rover will move backward.
  3. When the left gesture is detected, the Mars Rover will turn left.
  4. When the right gesture is detected, the Mars Rover will turn right.

Code

Logic

  1. First, we will initialize different Gesture classes.
  2. Then, we will open the recognition window, which will identify different poses and turn on the camera with a certain level of transparency to identify images from the stage.
  3. If the identified class from the analyzed image is “forward,” the Mars Rover will move forward at a specific speed.
  4. If the identified class is “backward,” the Mars Rover will move backward.
  5. If the identified class is “left,” the Mars Rover will move left.
  6. If the identified class is “right,” the Mars Rover will move right.
  7. Otherwise, the Mars Rover will stay in the home position (see the sketch below).
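
A minimal Python sketch of this loop is given below. recognise_gesture(), move(), and go_home() are hypothetical stand-ins for the exported model’s recognition block and the Mars Rover blocks; the speed value is a placeholder.

    # Hypothetical stand-ins for the model and motion blocks:
    def recognise_gesture() -> str: ...
    def move(direction: str, speed: int) -> None: ...
    def go_home() -> None: ...

    def run() -> None:
        while True:
            gesture = recognise_gesture()  # classify the camera frame
            if gesture == "forward":
                move("forward", 40)        # speed is a placeholder
            elif gesture == "backward":
                move("backward", 40)
            elif gesture == "left":
                move("left", 40)
            elif gesture == "right":
                move("right", 40)
            else:
                go_home()                  # no gesture: home position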

Output

