
go () at () % speed

Description

The block moves the Quarky robot in the specified direction. The direction can be “FORWARD”, “BACKWARD”, “LEFT”, or “RIGHT” (see the sketch after the list below).

  1. Forward: the robot moves forward.
  2. Backward: the robot moves backward.
  3. Left: the robot turns left.
  4. Right: the robot turns right.
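
For reference, the block's two parameters (direction and speed) can be mirrored in a short Python sketch. The go() helper below is a hypothetical stand-in for the block, not the actual Quarky API:

```python
# Hypothetical stand-in for the "go () at () % speed" block.
# The real PictoBlox/Quarky Python API may differ; this sketch only
# illustrates the block's two parameters.

VALID_DIRECTIONS = {"FORWARD", "BACKWARD", "LEFT", "RIGHT"}

def go(direction: str, speed: int) -> None:
    """Move the robot in `direction` at `speed` percent of full speed."""
    if direction not in VALID_DIRECTIONS:
        raise ValueError(f"unknown direction: {direction}")
    speed = max(0, min(100, speed))             # clamp to the block's 0-100 range
    print(f"go {direction} at {speed}% speed")  # placeholder for motor commands

go("FORWARD", 60)  # drive straight ahead at 60% speed
go("LEFT", 40)     # turn left at 40% speed
```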

Example

The example demonstrates how to control the motion of the robot using keyboard keys.
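
In Python terms, the key handling reduces to a small dispatch table. The key names and helpers below are assumptions for illustration; in PictoBlox the same logic is built from “when key pressed” hat blocks:

```python
# Illustrative key-to-direction dispatch for keyboard control.
# Key names and go() are hypothetical stand-ins, not the PictoBlox API.

KEY_TO_DIRECTION = {
    "up arrow": "FORWARD",
    "down arrow": "BACKWARD",
    "left arrow": "LEFT",
    "right arrow": "RIGHT",
}

def go(direction: str, speed: int) -> None:   # stub for the motion block
    print(f"go {direction} at {speed}% speed")

def on_key_pressed(key: str) -> None:
    direction = KEY_TO_DIRECTION.get(key)
    if direction is not None:                 # ignore unmapped keys
        go(direction, 60)

on_key_pressed("up arrow")   # -> go FORWARD at 60% speed
```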

Script

Output


The example demonstrates how to make a line follower robot with Quarky.

Logic
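
In outline, the robot steers toward whichever IR sensor still sees the line. A minimal Python sketch of that logic, assuming two downward-facing IR sensors that read True on the black line (the sensor helpers and go() are hypothetical stand-ins for the corresponding blocks):

```python
# Minimal line-follower step, assuming two IR sensors that read True
# when they see the black line. All helpers are hypothetical stand-ins.

def go(direction: str, speed: int) -> None:   # stub for the motion block
    print(f"go {direction} at {speed}% speed")

def ir_left() -> bool:    # stub: True when the left IR sensor sees the line
    return True

def ir_right() -> bool:   # stub: True when the right IR sensor sees the line
    return True

def follow_line_step(speed: int = 40) -> None:
    left, right = ir_left(), ir_right()
    if left and right:
        go("FORWARD", speed)   # both sensors on the line: drive straight
    elif left:
        go("LEFT", speed)      # line has drifted left: steer left
    elif right:
        go("RIGHT", speed)     # line has drifted right: steer right
    else:
        go("BACKWARD", speed)  # line lost: back up to re-acquire it

follow_line_step()
```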

Script

Alert: You need to calibrate the IR sensor for reliable line detection, and you need to tune the speeds so that the robot follows the line correctly.

Output

Learn how to code logic for detecting signs in the video input with this example block code. You will be able to direct your own Mars Rover simply by showing signs to the camera.

Introduction

A sign detector Mars Rover robot is a robot that can recognize and interpret certain signs or signals, such as hand gestures or verbal commands, given by a human. The robot uses sensors, cameras, and machine learning algorithms to detect and understand the sign, and then performs a corresponding action based on the signal detected.

These robots are often used in manufacturing, healthcare, and customer service industries to assist with tasks that require human-like interaction and decision making.

Code

Initializing the Functions:

Main Code

Logic

  1. First, the code sets up the stage camera to look for signs, detecting and recognizing the signs shown to the camera.
  2. Next, the code starts a loop in which the stage camera continuously checks for signs.
  3. Finally, if the robot sees certain signs (like ‘Go’, ‘Turn Left’, ‘Turn Right’, or ‘U Turn’), it moves accordingly (forward, left, right, or turns around), as sketched after this list.
  4. This helps the Mars Rover maneuver through the terrain simply by being shown signs through the camera.
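
A compact Python rendering of this dispatch. The sign labels come from this article, while on_sign_detected(), go(), and the interpretation of a U-turn as two successive left turns are hypothetical stand-ins for the camera-recognition and motion blocks:

```python
# Sign-to-motion dispatch matching the logic above. All helpers are
# hypothetical stand-ins for the recognition and motion blocks.

def go(direction: str, speed: int) -> None:
    print(f"go {direction} at {speed}% speed")

def u_turn() -> None:
    go("LEFT", 50)           # assumption: a U-turn as two successive left turns
    go("LEFT", 50)

SIGN_TO_ACTION = {
    "Go":         lambda: go("FORWARD", 50),
    "Turn Left":  lambda: go("LEFT", 50),
    "Turn Right": lambda: go("RIGHT", 50),
    "U Turn":     u_turn,
}

def on_sign_detected(sign: str) -> None:
    action = SIGN_TO_ACTION.get(sign)   # ignore unknown signs
    if action is not None:
        action()

on_sign_detected("Turn Left")   # -> go LEFT at 50% speed
```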

Output

Forward-Backward Motions:

Right-Left Motions:

Learn to control the Mars Rover using the Dabble app on your device, with customized functions for specialized circular motions.

Introduction

In this activity, we will control the Mars Rover according to our needs using the Dabble application on our own devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the home screen; we will then use this gamepad to control our Mars Rover.

Code

The following blocks represent the different functions created to control the Mars Rover's different types of motion. We will use the arrow buttons to control the basic movements (forward, backward, left, and right).

We will create custom functions for the specialized circular motions of the Mars Rover, using the Cross, Square, Circle, and Triangle buttons to trigger them.

Note: You will have to add the extensions of Mars Rover and also of Dabble to access the blocks.

The main code is quite simple, consisting of nested if-else blocks that determine the action when a specific button is pressed in the Dabble application, as sketched below.
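
The same nested if-else structure in Python form. The button names follow the Dabble gamepad, while the specific button-to-circular-motion mapping and the helpers are assumptions for illustration:

```python
# Nested if/else dispatch for the Dabble gamepad. The circular-motion
# mapping is an assumption; go() and circular_motion() are hypothetical
# stand-ins for the Mars Rover extension blocks.

def go(direction: str, speed: int) -> None:
    print(f"go {direction} at {speed}% speed")

def circular_motion(side: str) -> None:        # custom function from this activity
    print(f"circular motion to the {side}")

def handle_gamepad(button: str) -> None:
    if button == "up":
        go("FORWARD", 60)
    elif button == "down":
        go("BACKWARD", 60)
    elif button == "left":
        go("LEFT", 60)
    elif button == "right":
        go("RIGHT", 60)
    elif button in ("cross", "square"):
        circular_motion("left")                # assumed circular-left buttons
    elif button in ("circle", "triangle"):
        circular_motion("right")               # assumed circular-right buttons

handle_gamepad("triangle")   # -> circular motion to the right
```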

You will have to connect the Quarky to the Dabble application on your device. Make sure Bluetooth is enabled on the device before connecting. After uploading the code, connect the rover to the Dabble application by tapping the plug option, as seen below; select it and you will find your Quarky device. Connect by tapping the respective Quarky.

Important Notes

  1. The code will only run after it has been uploaded to the rover, which requires connecting the rover to the laptop with a USB-C cable.
  2. You will be able to upload the Python Code by selecting the Upload option beside the Stage option.
  3. There may be a case where you will have to upload the firmware first and then upload the code to the Rover. You will be able to upload the firmware in Quarky with the help of the following steps:
    1. Select the Quarky Palette from the Block Section.
    2. Select the Settings button on top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This resets the Quarky regardless of whether any previous code was uploaded.
  4. After the Firmware is uploaded, click on the “Upload Code” option to upload the code.
  5. You will have to add the block “When Quarky Starts Up” rather than the conventional “When Green Flag is Clicked” for the code to run.

Output

Forward-Backward Motions:

Right-Left Motions:

Circular Left Motion:

Circular Right Motion:

Learn how to create custom sounds to control the Mars Rover with the Audio Classifier of the Machine Learning Environment in PictoBlox. Start building your sound-controlled Mars Rover now!

In this activity, we will use the Machine Learning Environment of the PictoBlox software. We will use the Audio Classifier of the Machine Learning Environment and create our custom sounds to control the Mars Rover.

Audio Classifier Workflow

Follow the steps below to create your own Audio Classifier Model:

  1. Open PictoBlox and create a new file.
  2. Select Block Coding as the coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project“.
  5. A new window will open. Type in an appropriate project name of your choice and select the “Audio Classifier” extension. Click the “Create Project” button to open the Audio Classifier Window.
  6. You will see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.
  7. As you can observe in the above image, we will add two classes for audio. We will be able to add audio samples with the help of the microphone. Rename Class 1 as “clap” and Class 2 as “snap”.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the microphone.
  3. Add audio samples to each class, and make sure you add at least 20 samples per class for the model to run with good accuracy.
  4. For the first class, “clap”, record clap sounds through the microphone.
  5. For the second class, “snap”, record snap sounds through the microphone.

Note: You can only change a class name before adding any audio samples to it. Once audio samples have been added to a class, its name can no longer be changed.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the audio samples, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply speak into the microphone directly and check the predicted classes, as shown in the image below:

You will be able to test the difference between the audio samples recorded from the microphone, as shown below:

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box; PictoBlox will load your model into the Block Coding Environment, provided you opened the ML Environment from Block Coding.


Logic

The Mars Rover will move according to the following logic:

  1. When the audio is identified as “clap”, the Mars Rover will move forward.
  2. When the “snap” sound is detected, the Mars Rover will move backward.

Note: You can add even more classes with different distinguishing sounds to customize your control. This is just a small example from which you can build your own sound-controlled Mars Rover in an easy, stepwise procedure.


Code


Logic

  1. First, we will initialize the different audio classes.
  2. Then, we will open the recognition window and turn on the microphone, which will continuously record and identify the incoming audio.
  3. If the identified class from the analyzed audio is “clap,” the Mars Rover will move forward at a specific speed.
  4. If the identified class is “snap,” the Mars Rover will move backward (see the sketch after this list).
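
The same logic in a short Python sketch; classify_audio() is a hypothetical stand-in for the exported Audio Classifier model, and go() for the motion block:

```python
# Audio-class dispatch mirroring the logic above. classify_audio() and
# go() are hypothetical stand-ins for the exported model and motion block.

def go(direction: str, speed: int) -> None:
    print(f"go {direction} at {speed}% speed")

def classify_audio() -> str:
    return "clap"            # stub: would return the model's predicted class

for detected in (classify_audio(), "snap"):   # two example detections
    if detected == "clap":
        go("FORWARD", 50)    # clap -> move forward
    elif detected == "snap":
        go("BACKWARD", 50)   # snap -> move backward
```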

Output

This project demonstrates how to use the Machine Learning Environment to make a machine-learning model that identifies hand gestures and makes the Mars Rover move accordingly.

We are going to use the Hand Classifier of the Machine Learning Environment. The model works by analyzing your hand position with the help of 21 data points.

Hand Gesture Classifier Workflow

Follow the steps below:

  1. Open PictoBlox and create a new file.
  2. Select Block Coding as the coding environment.
  3. Select the “Open ML Environment” option under the “Files” tab to access the ML Environment.
  4. Click on “Create New Project“.
  5. A window will open. Type in a project name of your choice and select the “Hand Gesture Classifier” extension. Click the “Create Project” button to open the Hand Pose Classifier window.
  6. You will see the Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Hand Gesture Classifier

There are 2 things that you have to provide in a class:

  1. Class Name: It’s the name by which the class will be referred to.
  2. Hand Pose Data: This data can either be taken from the webcam or by uploading from local storage.

Note: You can add more classes to the projects using the Add Class button.

Adding Data to Class

You can perform the following operations to manage the data in a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the webcam or by uploading files from the local folder.

Note: You must add at least 20 samples to each of your classes for your model to train. More samples will lead to better results.

Training the Model

After data is added, it’s fit to be used in model training. In order to do this, we have to train the model. By training the model, we extract meaningful information from the hand pose, and that in turn updates the weights. Once these weights are saved, we can use our model to make predictions on data previously unseen.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of the accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box; PictoBlox will load your model into the Block Coding Environment, provided you opened the ML Environment from Block Coding.

Logic

The Mars Rover will move according to the following logic:

  1. When the forward gesture is detected, the Mars Rover will move forward.
  2. When the backward gesture is detected, the Mars Rover will move backward.
  3. When the left gesture is detected, the Mars Rover will turn left.
  4. When the right gesture is detected, the Mars Rover will turn right.

Code

Logic

  1. First, we will initialize the different gesture classes.
  2. Then, we will open the recognition window, which will identify the different poses, and turn on the camera with a certain level of transparency to identify images from the stage.
  3. If the identified class from the analyzed image is “forward,” the Mars Rover will move forward at a specific speed.
  4. If the identified class is “backward,” the Mars Rover will move backward.
  5. If the identified class is “left,” the Mars Rover will turn left.
  6. If the identified class is “right,” the Mars Rover will turn right.
  7. Otherwise, the Mars Rover will stay in the home position (see the sketch after this list).
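
A Python sketch of this dispatch; the gesture labels come from this article, while go() and home_position() are hypothetical stand-ins for the exported Hand Gesture Classifier's output handling and the motion blocks:

```python
# Gesture-class dispatch matching the list above. All helpers are
# hypothetical stand-ins for the exported model and motion blocks.

def go(direction: str, speed: int) -> None:
    print(f"go {direction} at {speed}% speed")

def home_position() -> None:
    print("hold the home position")

def act_on_gesture(gesture: str, speed: int = 50) -> None:
    if gesture == "forward":
        go("FORWARD", speed)
    elif gesture == "backward":
        go("BACKWARD", speed)
    elif gesture == "left":
        go("LEFT", speed)
    elif gesture == "right":
        go("RIGHT", speed)
    else:
        home_position()      # unrecognized class: stay in the home position

act_on_gesture("left")       # -> go LEFT at 50% speed
```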

Output

