Quarky Robotic Arm - Documentation
Getting Started with Quarky Robotic Arm
Refer to the tutorials to learn about the Robotic Arm and how to assemble and program it using block coding or Python coding.
Learn to control a Quarky robotic arm remotely using Bluetooth and PictoBlox. Perfect for tasks such as picking objects and manipulating tools, this project boosts productivity and safety in hazardous environments. Step-by-step coding guides are included for precise X-, Y-, and Z-axis and gripper control.
This activity focuses on programming a robotic arm using Python in PictoBlox, showcasing Python’s power in AI and ML. You’ll learn to initialize the arm, define movement functions for X, Y, and Z axes, control the gripper, and implement real-time control using a loop. This hands-on project sets the stage for embedding AI and ML into the robotic arm in future activities. Let’s get started!
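A minimal sketch of that structure is shown below. It assumes a hypothetical arm API: the import paths, the RoboticArm class, and the movetoposition() and setgripperangle() methods are placeholder names rather than the confirmed PictoBlox functions, and the Sprite/iskeypressed calls assume stage-mode Python.

```python
# Sketch only: import paths, RoboticArm, movetoposition() and setgripperangle()
# are placeholder names, not the confirmed PictoBlox API.
import time
from quarky import *                       # assumed PictoBlox import
from expansion_addon import RoboticArm     # placeholder import for the arm extension

sprite = Sprite("Tobi")                    # stage sprite used to read key presses
arm = RoboticArm(1, 2, 3, 4)               # hypothetical servo-pin mapping

# Current end-effector target (assumed units).
x, y, z = 0, 150, 100
STEP = 10                                  # distance moved per key press

def move_arm():
    """Send the current X, Y, Z target to the arm (assumed call)."""
    arm.movetoposition(x, y, z, 500)

move_arm()
while True:                                # real-time control loop
    if sprite.iskeypressed("up arrow"):
        y += STEP                          # Y axis forward
    if sprite.iskeypressed("down arrow"):
        y -= STEP
    if sprite.iskeypressed("right arrow"):
        x += STEP                          # X axis right
    if sprite.iskeypressed("left arrow"):
        x -= STEP
    if sprite.iskeypressed("w"):
        z += STEP                          # Z axis up
    if sprite.iskeypressed("s"):
        z -= STEP
    if sprite.iskeypressed("space"):
        arm.setgripperangle(0)             # close the gripper (placeholder method)
    if sprite.iskeypressed("g"):
        arm.setgripperangle(60)            # open the gripper
    move_arm()
    time.sleep(0.05)
```

Each key press nudges one axis by a fixed step, so the loop keeps the arm responsive for as long as the script runs.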
Learn how to make a robotic arm autonomous, moving beyond manual control to execute repetitive tasks efficiently. By following the coding steps in PictoBlox, you can program precise movements and actions, making the arm suitable for applications in fields such as manufacturing and medicine. With careful calibration and testing, you can transform your robotic arm into an autonomous system, enhancing its functionality and versatility.
Learn how to program a robotic arm to operate autonomously using Python in PictoBlox. This activity guides you through initializing the robotic arm, writing movement and gripper actions, and using a continuous loop for independent operation. You’ll also explore the transition from stage mode to upload mode, allowing the robotic arm to function without an external system or PictoBlox connection. Ideal for robotics enthusiasts and learners, this guide will help you take full control of robotic arms in a simple, efficient way.
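The sketch below shows what an upload-mode script of this kind can look like. The RoboticArm calls reuse the same placeholder names as the previous sketch; they are assumptions, not the confirmed extension API, and the coordinates are illustrative only.

```python
# Sketch of an upload-mode program: once flashed to the board, it repeats a
# pick-and-place cycle with no PictoBlox or PC connection. RoboticArm,
# movetoposition() and setgripperangle() are placeholder names.
import time
from quarky import *                       # assumed PictoBlox import
from expansion_addon import RoboticArm     # placeholder import for the arm extension

arm = RoboticArm(1, 2, 3, 4)               # hypothetical servo-pin mapping

PICK = (100, 120, 20)                      # pick-up coordinates (illustrative)
DROP = (-100, 120, 20)                     # drop-off coordinates
HOME = (0, 150, 100)                       # rest position

arm.movetoposition(*HOME, 500)

# In upload mode there is no stage, so the whole behaviour lives in one loop
# that the board runs forever after power-up.
while True:
    arm.movetoposition(*PICK, 500)         # reach the object
    arm.setgripperangle(0)                 # close the gripper
    time.sleep(0.5)
    arm.movetoposition(*DROP, 500)         # carry it to the drop zone
    arm.setgripperangle(60)                # open the gripper
    time.sleep(0.5)
    arm.movetoposition(*HOME, 500)         # return home before repeating
```

Because the loop never exits, the arm keeps repeating the task until power is removed, which is the key difference from running the same code in stage mode.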
This activity introduces the integration of machine learning into robotics by developing a hand gesture recognition model in PictoBlox. Through systematic steps, you learn to train, test, and export the model to control a robotic arm using gestures. By combining gesture analysis with robotic arm settings, this project highlights the potential of machine learning in enabling intuitive and precise control in robotics, paving the way for innovative applications.
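Structurally, the control side of such a project can be reduced to the sketch below: recognise_gesture() stands in for the gesture model trained and exported in PictoBlox (its real interface will differ), and the RoboticArm calls reuse the placeholder names from the earlier sketches.

```python
# Structural sketch only: recognise_gesture() is a stand-in for the exported
# hand-gesture model, and the RoboticArm calls use placeholder names.
import time
from quarky import *                       # assumed PictoBlox import
from expansion_addon import RoboticArm     # placeholder import for the arm extension

arm = RoboticArm(1, 2, 3, 4)               # hypothetical servo-pin mapping

def recognise_gesture():
    """Return the predicted gesture label for the current camera frame (placeholder)."""
    ...

# Map each trained gesture class to an arm action (illustrative labels).
ACTIONS = {
    "up":    lambda: arm.movetoposition(0, 150, 150, 500),
    "down":  lambda: arm.movetoposition(0, 150, 50, 500),
    "open":  lambda: arm.setgripperangle(60),
    "close": lambda: arm.setgripperangle(0),
}

while True:
    gesture = recognise_gesture()          # analyse the current frame
    action = ACTIONS.get(gesture)
    if action:
        action()                           # drive the arm from the recognised gesture
    time.sleep(0.2)
```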
Quarky Robotic Arm Project - Block Coding
Refer to the tutorials to learn how to use PictoBlox Block Coding Environment to code Quarky Robotic Arm for different applications.
The example demonstrates using the Quarky touch display to make a touch piano.
The example demonstrates how to run different actions with the Quarky touch sensor to make a disco party in PictoBlox.
The example demonstrates how to use an ultrasonic sensor with Quarky.
The example demonstrates how to control sprite movement with the Quarky buttons.
The example demonstrates using the Quarky touch display to make a touch piano in the Python Coding Mode.
The example demonstrates how to use an ultrasonic sensor with Quarky in the Python Coding Environment.
The example demonstrates how to run different actions with the Quarky touch sensor to make a disco party in the Python Coding Environment.
A waste management system that differentiates waste based on its type. If it detects biodegradable waste, the LEDs on Quarky’s matrix will turn green; if it’s non-biodegradable waste, the LEDs will turn blue.
The example demonstrates how to run object detection on the stage and show all the detected objects with their confidence.
The example demonstrates how to detect persons on the stage with different confidence thresholds.
A waste management system that differentiates waste based on its type, in the Python Coding Environment. If it detects biodegradable waste, the LEDs on Quarky’s matrix will turn green; if it’s non-biodegradable waste, the LEDs will turn blue.
The example demonstrates how to run object detection on the stage and show all the detected objects with their confidence in the Python Coding Environment.
The example demonstrates how to detect persons on the stage with different confidence thresholds in the Python Coding Environment.
The examples show how to use Pose Recognition in PictoBlox to count the number of detected body parts.
The example demonstrates how to use hand recognition and pen extensions to make an air draw game.
The example demonstrates how to use hand recognition to track the different parts of the fingers.
The example demonstrates how to use human body detection to track the nose and make someone a clown.
The example demonstrates how to use hand recognition to track the different parts of the fingers in Python Coding Environment.
The examples show how to use Pose Recognition in PictoBlox to count the number of detected body parts in the Python Coding Environment.
The example demonstrates how to use human body detection to track the nose and make someone a clown in the Python Coding Environment.
The example demonstrates how to use hand recognition and pen extensions to make an air draw game in the Python Coding Environment.
The example demonstrates how to use sign detection and make the Quarky show the detection on the LED.
The example demonstrates how to implement sign detection in PictoBlox.
The example demonstrates how to find the closest sign among multiple sign detections and make decisions accordingly.
The example demonstrates how to make an object-tracking robot.
The example demonstrates how to make smart home automation for light control using NLP and Speech Recognition.
The example demonstrates how to make a QR Code reader.
The example demonstrates how to make a QR Code reader in the Python Coding Environment.
The example demonstrates how to make a sprite that is fixed at a point but can rotate, so the wand hangs like a pendulum.
In this example, you explore the effect of the density, roughness, and bounce properties of sprites. The ball falls from the top at random positions, and the bell is fixed but can rotate.
PictoBlox Extension Documentation
Refer to the documentation of PictoBlox Extension to understand how to use the blocks and functions for the Robotic Arm.