set () to ()

Description

The block sets the four servo motors of the legs to align with the specified orientation – inside, left, and right.

Inside

Left

Right

Example

Discover a unique experience in the Synonym/Antonym World, where the powers of Speech Recognition and the ChatGPT extension combine.

Introduction

Hey! Welcome to the fascinating realm of “Synonym/Antonym World,” where the powers of Speech Recognition and ChatGPT converge. Immerse yourself in an innovative platform that not only recognizes your speech but also provides an extensive collection of synonyms and antonyms for any given word. With this powerful combination, you can effortlessly expand your vocabulary, explore alternative expressions, and delve into the nuances of language. Unleash the potential of speech recognition and ChatGPT as you navigate through a world where words find their perfect counterparts. Get ready to unlock new dimensions of linguistic exploration in the captivating Synonym/Antonym World!

Code

Logic

  1. Open PictoBlox and create a new file.
  2. Choose a suitable coding environment for Block-based coding.
  3. We create an instance of the Speech Recognition class. This class allows us to convert spoken audio into text.
  4. Next, we create an instance of the ChatGPT model called gpt. ChatGPT is a language model that can generate human-like text responses based on the input it receives. 
  5. Recognize speech for 5 seconds using the “recognize speech for ()s in the ()” block.
  6. Save the recognized result in the “input” variable.
  7. Use the “get(synonyms) of ()” function to obtain synonyms of the recognized speech result.
  8. ChatGPT will respond with 10 synonyms for the given input in the “Synonym World”.
  9. Use the “get(antonyms) of ()” function to obtain antonyms of the recognized speech result.
  10. The antonyms of the given input will be displayed by the sprite.
  11. Click on the green flag to run the code.
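
The logic above can be summarized in pseudocode. The sketch below is a minimal illustration only, not the actual PictoBlox API: recognize_speech, get_synonyms, and get_antonyms are hypothetical stand-ins for the Speech Recognition and ChatGPT extension blocks.

```python
# Illustrative sketch only: these stubs stand in for the PictoBlox
# Speech Recognition and ChatGPT extension blocks (hypothetical names).

def recognize_speech(duration_seconds):
    # Stands in for the "recognize speech for ()s in the ()" block.
    return "happy"  # placeholder for the recognized word

def get_synonyms(word):
    # Stands in for the "get (synonyms) of ()" ChatGPT block.
    return ["joyful", "cheerful", "content"]  # placeholder response

def get_antonyms(word):
    # Stands in for the "get (antonyms) of ()" ChatGPT block.
    return ["sad", "unhappy", "miserable"]  # placeholder response

# Main flow mirroring steps 5-10 above.
user_input = recognize_speech(5)                                  # steps 5-6
print("Synonyms of", user_input, ":", get_synonyms(user_input))   # steps 7-8
print("Antonyms of", user_input, ":", get_antonyms(user_input))   # steps 9-10
```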

Output

Learn to control Mars Rover using Dabble App on your device with customized functions for specialized circular motions.

Introduction

In this activity, we will control the Mars Rover according to our needs using the Dabble application on our own devices.

We will first understand how to operate Dabble and how to modify our code according to the requirements. The following image is the front page of the Dabble Application.

Select the Gamepad option from the home screen; we will then use this gamepad to control our Mars Rover.

Code

The following blocks represent the different functions created to control the Mars Rover for different types of motion. We will use the arrow buttons to control the basic movements (Forward, Backward, Left, Right).

We will create custom functions for specialized circular motions of the Mars Rover, and use the Cross, Square, Circle, and Triangle buttons to control them.

Note: You will have to add both the Mars Rover and Dabble extensions to access these blocks.

The main code is quite simple, consisting of nested if-else blocks that determine the action to perform when a specific button is pressed in the Dabble application.
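
As a rough illustration of that structure, the sketch below expresses the same checks in Python. The names is_pressed, move_forward, circle_left, and so on are hypothetical placeholders for the Dabble gamepad and Mars Rover blocks, not actual PictoBlox calls.

```python
# Hypothetical stand-ins for the Dabble gamepad and Mars Rover blocks
# (illustration only, not actual PictoBlox calls).
def is_pressed(button):
    return False                               # placeholder: would query the Dabble gamepad

def move_forward():  print("forward")          # placeholder basic-motion functions
def move_backward(): print("backward")
def turn_left():     print("left")
def turn_right():    print("right")
def circle_left():   print("circular left")    # custom circular-motion functions
def circle_right():  print("circular right")
def stop():          print("stop")

# Nested if-else checks, one branch per gamepad button.
# On the rover this block runs repeatedly inside a forever loop.
if is_pressed("up"):
    move_forward()
elif is_pressed("down"):
    move_backward()
elif is_pressed("left"):
    turn_left()
elif is_pressed("right"):
    turn_right()
elif is_pressed("square"):
    circle_left()
elif is_pressed("circle"):
    circle_right()
else:
    stop()
```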

You will have to connect Quarky with the Dabble application on your device. Make sure Bluetooth is enabled on the device before connecting. After uploading the code, connect the Rover to the Dabble application by clicking on the plug option, as seen below, and then selecting your Quarky device from the list.

Important Notes

  1. The code will only run after it is uploaded to the rover, which must be connected to the laptop with a C-type cable.
  2. You will be able to upload the code by selecting the Upload option beside the Stage option.
  3. There may be a case where you will have to upload the firmware first and then upload the code to the Rover. You will be able to upload the firmware in Quarky with the help of the following steps:
    1. Select the Quarky Palette from the Block Section.
    2. Select the Settings button on top of the palette.
    3. In the settings dialog box, scroll down and select the Upload Firmware option. This will reset the Quarky, clearing any code that was previously uploaded.
  4. After the Firmware is uploaded, click on the “Upload Code” option to upload the code.
  5. You will have to add the block “When Quarky Starts Up” rather than the conventional “When Green Flag is Clicked” for the code to run.

Output

Forward-Backward Motions:

Right-Left Motions:

Circular Left Motion:

Circular Right Motion:

Explore the surroundings with our obstacle avoidance Mars Rover that uses an ultrasonic sensor to detect and avoid obstacles. Learn how the robot moves, detects obstacles, and navigates its way through them.

This obstacle avoidance project is for a robot that moves around and looks for obstacles. It uses an ultrasonic sensor to measure the distance. If the distance is less than 20 cm, it stops and looks in both directions to see if it can move forward. If it can, it turns left or right. If not, it makes a U-turn.

Logic

  1. This code makes the robot move around and explore its surroundings. It has an ultrasonic sensor that measures the distance to obstacles.
  2. We will first initialize the servos of the Mars Rover with the block “Set head pins()”.
  3. Then we will make all the servos rotate to 90 degrees if they are not initialized.
  4. Thereafter we will initialize the ultrasonic sensors and define the minimum and maximum distance variables.
  5. The main logic first checks whether the measured distance is less than the minimum distance. If it is, the head servo moves to 45 degrees and checks whether the distance there is greater than the maximum distance; if so, the rover turns in that direction.
  6. With the help of the head servo, the robot checks the distance at 90 degrees, 45 degrees, 135 degrees, 0 degrees, and 180 degrees, in that order.
  7. Whenever the measured distance is less than the minimum distance, the head servo moves to the next angle and checks the distance again.
  8. In the worst case, where obstacles are detected at all angles, the robot reverses its direction by rotating 180 degrees, as sketched in the pseudocode below. In this way, the robot can navigate its way around every obstacle.
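
Before looking at the block code below, the decision sequence can be sketched in pseudocode. The helper names (read_distance, set_head_angle, and the motion functions) are hypothetical stand-ins for the Mars Rover blocks, and the 40 cm maximum distance is an assumed value used only for illustration; the 20 cm minimum comes from the description above.

```python
MIN_DISTANCE = 20    # cm: closer than this counts as an obstacle
MAX_DISTANCE = 40    # cm: assumed "path is clear" threshold (illustrative value)

# Hypothetical stand-ins for the Mars Rover blocks.
def read_distance():        return 100                # placeholder ultrasonic reading (cm)
def set_head_angle(angle):  print("head ->", angle)
def move_forward():         print("moving forward")
def turn_towards(angle):    print("turning towards", angle)
def reverse_direction():    print("rotating 180 degrees (U-turn)")

SCAN_ANGLES = [90, 45, 135, 0, 180]   # checked in the order stated in step 6

def explore_step():
    set_head_angle(90)
    if read_distance() >= MIN_DISTANCE:
        move_forward()                     # path ahead is clear
        return
    # Obstacle ahead: scan the remaining angles for a clear direction.
    for angle in SCAN_ANGLES[1:]:
        set_head_angle(angle)
        if read_distance() > MAX_DISTANCE:
            turn_towards(angle)            # clear path found, turn that way
            return
    reverse_direction()                    # every direction blocked: make a U-turn

explore_step()                             # on the rover this runs in a loop
```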

Code:

Main Functions:

 

Final Main Logic:

Output

 

This example shows how to use pose recognition in PictoBlox to make a jumping jack counter.

Introduction

In this example project, we are going to create a machine learning model that can count the number of jumping jack activities from the camera feed.

Pose Classifier in Machine Learning Environment

The Pose Classifier is the extension of the ML Environment used for classifying different body poses into different classes.

The model works by analyzing your body position with the help of 17 data points.

Pose Classifier Workflow

  1. Open PictoBlox and create a new file.
  2. You can click on “Machine Learning Environment” to open it.
  3. Click on “Create New Project“.
  4. A window will open. Type in a project name of your choice and select the “Pose Classifier” extension. Click the “Create Project” button to open the Pose Classifier window.
  5. You shall see the Pose Classifier workflow with two classes already made for you. Your environment is all set. Now it’s time to upload the data.

Class in Pose Classifier

Class is the category into which the Machine Learning model classifies the poses. Similar poses are put in one class.

There are 2 things that you have to provide in a class:

  1. Class Name: The name to which the class will be referred.
  2. Pose Data: This data can be taken from the webcam or uploaded from local storage.

Adding Data to Class

You can perform the following operations to manipulate the data into a class.

  1. Naming the Class: You can rename the class by clicking on the edit button.
  2. Adding Data to the Class: You can add the data using the Webcam or by Uploading the files from the local folder.
    1. Webcam:

Training the Model

After data is added, it’s fit to be used in model training. To do this, we have to train the model. By training the model, we extract meaningful information from the body poses, and that in turn updates the weights. Once these weights are saved, we can use our model to predict previously unseen data.

The accuracy of the model should increase over time. The x-axis of the graph shows the epochs, and the y-axis represents the accuracy at the corresponding epoch. Remember, the higher the reading in the accuracy graph, the better the model. The range of accuracy is 0 to 1.

Testing the Model

To test the model, simply enter the input values in the “Testing” panel and click on the “Predict” button.

The model will return the probability of the input belonging to the classes.

Export in Block Coding

Click on the “Export Model” button on the top right of the Testing box, and PictoBlox will load your model into the Block Coding Environment if you have opened the ML Environment in the Block Coding.

Script

The idea is simple: after running the code, we will do the jumping jack activity in front of the camera, and the Tobi sprite will say the count of jumping jacks.

  1. Select the Tobi sprite.
  2. We’ll start by adding a when flag clicked block from the Events palette.
  3. Make a new variable, “count”, by choosing the “Make a Variable” option from the Variables palette.
  4. Similarly, make a new variable, “temp”, from the Variables palette.
  5. Add “forever” from the Control palette.
  6. Inside the “forever” block, add an “analysis image from ()” block from the Machine Learning palette. Select the Web camera option.
  7. Inside the “forever” block, add an “if () then” block from the Control palette.
  8. In the empty place of the “if () then” block, add a “key () pressed?” block from the Sensing palette. Select the ‘q’ key from the options.
  9. Inside the “if () then” block, add the “Set () to ()” block from the Variables palette. Select the count option at the first empty place, and for the second, write a 0 value.
  10. Also add the “Set () to ()” block from the Variables palette. Select the temp option at the first empty place, and for the second, write a 0 value.
  11. Inside the “forever” block, add a new “if () then” block from the Control palette.
  12. In the empty place of the “if () then” block, add an “is identified class ()” block from the Machine Learning palette. Select the ‘Upper hand‘ option from the options.
  13. Inside the “if () then” block, add the “Set () to ()” block from the Variables palette. Select the temp option at the first empty place, and for the second, write a 1 value.
  14. Inside the “forever” block, add a new “if () then” block from the Control palette.
  15. In the empty place of the “if () then” block, add an “is identified class ()” block from the Machine Learning palette. Select the ‘Down hand‘ option from the options.
  16. Inside the “if () then” block, add another “if () then” block from the Control palette.
  17. In the empty place of the “if () then” block, add a comparison (“=”) block from the Operators palette. At the first empty place, put the temp variable from the Variables palette, and at the second place, write a 1 value.
  18. Inside the “if () then” block, add the “change () by ()” block from the Variables palette. Select the count option at the first empty place, and for the second, write a 1 value.
  19. Also add the “Set () to ()” block from the Variables palette. Select the temp option at the first empty place, and for the second, write a 0 value.
  20. Inside the “if () then” block, add a “say () for () seconds” block from the Looks palette. At the first empty place, add the “join () ()” block from the Operators palette, and at the second place, write a 2 value.
  21. Inside the “join () ()” block, at the first empty place, write an appropriate statement, and at the second place, add the count variable from the Variables palette.
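
The counting logic built from these blocks is essentially a small state machine: temp records that an “Upper hand” pose has been seen, and count increases when a “Down hand” pose follows it. The sketch below is a minimal Python illustration of that logic only; identified_class is a hypothetical stand-in for the Machine Learning palette blocks, and the ‘q’-key reset is omitted for brevity.

```python
import random

# Hypothetical stand-in for the "is identified class ()" block: returns the
# pose class detected in the current camera frame.
def identified_class():
    return random.choice(["Upper hand", "Down hand"])   # placeholder result

count = 0   # completed jumping jacks
temp = 0    # 1 while an "Upper hand" pose has been seen but the rep is not finished

for _ in range(20):                        # the script uses a forever loop instead
    pose = identified_class()
    if pose == "Upper hand":
        temp = 1                           # arms up: first half of a jumping jack
    if pose == "Down hand" and temp == 1:
        count += 1                         # arms back down: one full repetition
        temp = 0
        print("Jumping jacks so far:", count)   # the Tobi sprite says this count
```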

Final Output
