Quarky Robotic Arm - Documentation

Getting Started with Quarky Robotic Arm

Refer to the tutorials to learn about the Robotic Arm and how to assemble and program it using block coding or Python.

Robotic Arm V6
Learn how to assemble the Quarky Robotic Arm with this step-by-step guide. Follow the steps to make the Quarky Robotic Arm look like the image shown in this guide, and use it to explore more complicated programs and activities.
Learn to control the Quarky Robotic Arm remotely using Bluetooth and PictoBlox. Ideal for tasks such as picking objects and manipulating tools, this project boosts productivity and safety in hazardous environments. Step-by-step coding guides cover precise X-, Y-, and Z-axis movement and gripper control.
This activity focuses on programming a robotic arm using Python in PictoBlox, showcasing Python’s power in AI and ML. You’ll learn to initialize the arm, define movement functions for X, Y, and Z axes, control the gripper, and implement real-time control using a loop. This hands-on project sets the stage for embedding AI and ML into the robotic arm in future activities. Let’s get started!
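The structure described above can be sketched in plain Python. Note that the state dictionary, function names, and servo ranges below are illustrative assumptions, not the actual PictoBlox robotic-arm API; they only show the shape of the movement functions and the real-time control loop.

```python
import time

# Illustrative servo-angle state for the three axes and the gripper.
# Real code would call PictoBlox's robotic-arm Python API instead of
# mutating this dictionary; names and ranges here are assumptions.
arm_state = {"x": 90, "y": 90, "z": 90, "gripper": 0}

def move_axis(axis, step, lo=0, hi=180):
    """Nudge one axis by `step` degrees, clamped to the servo range."""
    arm_state[axis] = max(lo, min(hi, arm_state[axis] + step))

def set_gripper(open_gripper):
    """Open (1) or close (0) the gripper."""
    arm_state["gripper"] = 1 if open_gripper else 0

def control_loop(commands):
    """Real-time control: map incoming commands to arm movements.
    In PictoBlox this would run forever, polling key presses."""
    for key in commands:
        if key == "up":
            move_axis("z", +5)
        elif key == "down":
            move_axis("z", -5)
        elif key == "left":
            move_axis("x", -5)
        elif key == "right":
            move_axis("x", +5)
        elif key == "open":
            set_gripper(True)
        elif key == "close":
            set_gripper(False)
        time.sleep(0.01)  # brief pause so the servos can settle

control_loop(["up", "up", "right", "open"])
print(arm_state)  # {'x': 95, 'y': 90, 'z': 100, 'gripper': 1}
```

Clamping each axis to its servo range keeps a runaway loop from commanding angles the hardware cannot reach.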
Learn how to make a robotic arm autonomous, moving beyond manual control to execute repetitive tasks efficiently. By following the coding steps in PictoBlox, you can program precise movements and actions, making it suitable for applications in industries like manufacturing and medical fields. With careful calibration and testing, you can successfully transform your robotic arm into an autonomous system, enhancing its functionality and versatility.
Learn how to program a robotic arm to operate autonomously using Python in PictoBlox. This activity guides you through initializing the robotic arm, writing movement and gripper actions, and using a continuous loop for independent operation. You'll also explore the transition from stage mode to upload mode, allowing the robotic arm to function without an external system or PictoBlox connection. Ideal for robotics enthusiasts and learners, this guide will help you take full control of the robotic arm in a simple, efficient way.
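An autonomous routine of this kind repeats a fixed sequence of poses with no external input. A hedged sketch follows; the waypoints and the `move_to`/`grip` helpers are placeholders standing in for the real PictoBlox calls:

```python
import time

log = []  # records each command so the routine can be inspected

def move_to(x, y, z):
    """Placeholder for the PictoBlox call that drives the arm to (x, y, z)."""
    log.append(("move", x, y, z))

def grip(close):
    """Placeholder for the gripper command."""
    log.append(("grip", close))

# A simple pick-and-place cycle; these waypoints are made-up values.
PICK, PLACE = (120, 40, 10), (60, 40, 10)

def run_cycle():
    move_to(*PICK)
    grip(True)        # close on the object
    move_to(*PLACE)
    grip(False)       # release it
    time.sleep(0.01)  # settle before the next cycle

# In upload mode this loop runs on the Quarky board itself, with no
# live PictoBlox connection; on the hardware you would use `while True:`.
for _ in range(2):
    run_cycle()
print(len(log))  # 8 commands: 2 cycles x 4 actions
```

Keeping the cycle in its own function makes it easy to calibrate one waypoint at a time in stage mode before switching to upload mode.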
This activity introduces the integration of machine learning into robotics by developing a hand gesture recognition model in PictoBlox. Through systematic steps, you'll learn to train, test, and export the model to control a robotic arm using gestures. By combining gesture analysis with robotic arm settings, this project highlights the potential of machine learning in enabling intuitive and precise control in robotics, paving the way for innovative applications.
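The gesture-to-arm pipeline can be sketched as a lookup from recognized gesture labels to arm actions. The labels, poses, and threshold below are illustrative assumptions; the real labels come from the model you train in PictoBlox's ML environment:

```python
# Map of gesture labels (as a trained hand-gesture model might emit
# them) to arm actions; all names and values here are assumptions.
GESTURE_TO_ACTION = {
    "point_up":   {"z": +10},
    "point_down": {"z": -10},
    "fist":       {"gripper": 0},   # close the gripper
    "open_palm":  {"gripper": 1},   # open the gripper
}

arm = {"x": 90, "y": 90, "z": 90, "gripper": 1}

def apply_gesture(label, confidence, threshold=0.8):
    """Act only on confident predictions, mirroring a confidence
    threshold on the exported model's output."""
    if confidence < threshold or label not in GESTURE_TO_ACTION:
        return False
    for key, value in GESTURE_TO_ACTION[label].items():
        if key == "gripper":
            arm[key] = value
        else:  # relative move, clamped to the servo range
            arm[key] = max(0, min(180, arm[key] + value))
    return True

apply_gesture("point_up", 0.95)   # z: 90 -> 100
apply_gesture("fist", 0.50)       # ignored: below threshold
print(arm["z"], arm["gripper"])   # 100 1
```

The confidence gate is what keeps a noisy camera frame from twitching the arm on a misclassified gesture.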

Quarky Robotic Arm Project - Block Coding

Refer to the tutorials to learn how to use the PictoBlox Block Coding Environment to code the Quarky Robotic Arm for different applications.

The project counts the number of faces detected on the stage.
The example demonstrates how face recognition works with analysis on the stage.
The example demonstrates how face recognition works with analysis on the camera.
The example demonstrates the application of face detection with a stage feed.
The example demonstrates how to use face landmarks in the projects.
The example demonstrates how to use face detection with a camera feed.
The project shows how to create custom patterns on Quarky RGB LED in Stage Mode.
The project shows how to create custom patterns on Quarky RGB LED in Upload Mode.
The project makes Quarky display the expression identified by face recognition.
The example shows how to run image classification in Python on a webcam feed using OpenCV.
The example shows how to run image classification in Python on an image file using OpenCV.
The example shows how to run image classification in Block Coding.
The example demonstrates how to use the confidence threshold in face detection (Block Coding).
The example shows how to detect an expression using face detection on the camera feed and mimic it on Quarky.
The example shows how to create a face filter with Face Detection. It also includes how to make the filter tilt with face angles.
The example demonstrates the use of the clone and glide functions on a sprite.
The example demonstrates how to make the sprite glide to a random position on the stage when it is clicked.
The example demonstrates how to use stamping and mouse-location sensing in block coding.
The example demonstrates how to use key sensing to control the movement of the sprite.
The example demonstrates sprite wall bouncing and rotation styles.
The example demonstrates how to make the sprite follow the mouse.
The example demonstrates how to add gravity to a bouncing ball in a project.
The example demonstrates how to implement mouse tracking.
The example demonstrates how to add movement to a sprite using the key detection hat block.
The example demonstrates the sprite direction in PictoBlox.
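The image-classification examples listed above share one structure: grab a frame, run the model, and act only on confident predictions. A stubbed sketch of that loop follows; the frame source and classifier are placeholders (real code would use OpenCV's `cv2.VideoCapture` and a trained model), so only the loop's shape is shown:

```python
# Skeleton of a webcam classification loop with stubbed I/O and model.

def get_frame(source):
    """Stub frame grabber; real code: ok, frame = capture.read()."""
    return next(source, None)

def classify(frame):
    """Stub classifier returning (label, confidence); a real model
    would preprocess the frame (resize, normalize) and run inference."""
    return ("cat", 0.9) if frame % 2 == 0 else ("dog", 0.6)

def run(frames, threshold=0.7):
    """Classify every frame, keeping only confident predictions."""
    results = []
    source = iter(frames)
    while (frame := get_frame(source)) is not None:
        label, confidence = classify(frame)
        if confidence >= threshold:
            results.append(label)
    return results

print(run([0, 1, 2, 3]))  # ['cat', 'cat']
```

The same loop works for a single image file by passing a one-element frame list, which is the only difference between the webcam and image-file variants.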

PictoBlox Extension Documentation

Refer to the documentation of the PictoBlox Extension to understand how to use the blocks and functions for the Robotic Arm.