SpotMicroAI

I started this project because I got inspired by some very smart people, companies and projects out there, and I want to understand and adapt their work. It is based on existing open-source projects and uses affordable hardware to enable other people to build their own bots and help us understand how to control them the way we want.

This project is very much a work in progress and may change every day. It is NOT a finished, working project that you can simply use.

PyBullet Simulation

See the first movements of SpotMicroAI on YouTube

Parts of this project:

  1. Build a working physical robot from cheap components, so that everyone can build one.
  2. Create a simulated environment and be able to control the robot in it.
  3. Do RL training to make it learn how to stand/walk/run.

1. The physical Robot

SpotMicroAI

First of all, thanks to Deok-yeon Kim, aka KDY0523, who published this incredible work on Thingiverse.

This is essentially the physical robot. It will take some days to print and assemble all the parts, but it's worth the effort. I also sanded, primed and painted all the parts to give it a nicer look.

Here is my Thingiverse-Make

Since my setup required some additional hardware, I recreated some parts using FreeCAD - see /Parts-Directory.

Parts

NVIDIA Jetson Nano

The brain of all this is the Jetson Nano. It has a 16-channel PCA9685 I2C servo driver connected, which is used to control the servos. The IMU (GY-521) is also connected via I2C and provides the roll and pitch angles of the robot. The OLED display is used for some nice status output. I will provide a Fritzing layout in the near future.

JetsonNano-Case

You can find all the code for the Jetson Nano here.
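To give a rough idea of what the servo side looks like in code, here is a minimal sketch assuming the Adafruit ServoKit library and the default PCA9685 I2C address; the channel number and pulse range are placeholder values, and the real driver code lives in the repository linked above:

```python
# Minimal sketch: sweep one leg servo on the PCA9685.
# Assumes the adafruit-circuitpython-servokit package; channel 0 is a placeholder.
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)                     # PCA9685 exposes 16 PWM channels
kit.servo[0].set_pulse_width_range(500, 2500)   # typical range for hobby servos

for angle in (45, 90, 135, 90):                 # move one joint through a few positions
    kit.servo[0].angle = angle
    time.sleep(0.5)
```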

Sensors

We will use sonar sensors instead of visual sensors like RGB or RGB-D cameras. Maybe I will try ESPEyes or something similar in the near future.

Sensors used:

  • 4x HC-SR04 sensors: two in the front, as in the original model, looking forward/down; two at the bottom (front/back) looking down to measure the ground distance.
  • An MPU-6050 IMU is used to measure pitch, roll and velocities; yaw is ignored since it drifts quickly (a short read-out sketch follows this list).
  • RPLidar A1 - the cheapest lidar I could find. Works. Connected to the Jetson via USB; speed control via PWM from the Jetson.
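To show how the roll/pitch values come out of the MPU-6050, here is a minimal sketch that reads the raw accelerometer registers over I2C and converts them to angles. It assumes the smbus2 package, I2C bus 1 and the default device address 0x68; it is an illustration, not the code running on the robot.

```python
# Minimal sketch: read roll/pitch from the MPU-6050 accelerometer over I2C.
# Assumes smbus2, I2C bus 1 and device address 0x68 (MPU-6050 default).
import math
from smbus2 import SMBus

MPU_ADDR = 0x68

def read_word(bus, reg):
    """Read a signed 16-bit big-endian value from a register pair."""
    high = bus.read_byte_data(MPU_ADDR, reg)
    low = bus.read_byte_data(MPU_ADDR, reg + 1)
    value = (high << 8) | low
    return value - 65536 if value > 32767 else value

with SMBus(1) as bus:
    bus.write_byte_data(MPU_ADDR, 0x6B, 0)       # wake the sensor (PWR_MGMT_1)
    ax = read_word(bus, 0x3B) / 16384.0          # ACCEL_XOUT, +/-2g full scale
    ay = read_word(bus, 0x3D) / 16384.0
    az = read_word(bus, 0x3F) / 16384.0
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    print(f"roll={roll:.1f} deg, pitch={pitch:.1f} deg")
```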

I also have an SSD1306 OLED display and a NeoMatrix LED circle that I want to include for style. In a first version I used an Arduino Mega as a kind of servo/sensor controller and a Raspberry Pi as locomotion controller (communicating via UART). But it turned out that the Arduino is too slow to handle the sensor signals and the servo PWM properly at the same time.

I am not sure whether the hardware I use now will be enough to finally get a very smoothly walking robot like, for example, the real SpotMini. See this more as a research project in which I try to use cheap hardware and other people's work to learn more about how all of this works.

2. Simulation

PyBullet

I am trying to implement the ideas of this paper by Jie Tan, Tingnan Zhang, Erwin Coumans, Atil Iscen, Yunfei Bai, Danijar Hafner, Steven Bohez and Vincent Vanhoucke (Google Brain, Google DeepMind).

Here you can see the first version of the URDF model.

urdf

And here is the model with working kinematics in a PyBullet simulation.

The masses and inertias of the URDF model are still not correct. There is also a Blender file included, which I used to create the STLs for the simulation. Of course you could also do some nice renderings with it! :)
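Loading the URDF into PyBullet only takes a few lines. The sketch below shows the general idea; the file name spotmicroai.urdf and the start pose are placeholders, and the actual simulation setup is in the Core directory of this repository:

```python
# Minimal sketch: load a URDF into PyBullet and step the simulation.
# The robot URDF path is a placeholder; see the repository for the real setup.
import time
import pybullet as p
import pybullet_data

p.connect(p.GUI)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")                                           # ground plane
robot = p.loadURDF("spotmicroai.urdf", basePosition=[0, 0, 0.3])   # placeholder path

for _ in range(1000):
    p.stepSimulation()
    time.sleep(1.0 / 240.0)
```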

Quickstart

This example can be found in the repository. You need a gamepad for this to work:

pip3 install numpy
pip3 install pybullet
pip3 install inputs
...TODO: provide setup.py

cd Core/
python3 example_automatic_gait.py

Kinematics

In order to move the robot or even make it walk, we need something that tells us which servo angles are needed for a leg to reach position XYZ. This is what inverse kinematics does. We know all the constraints: the lengths of the leg segments, how the joints rotate and where they are positioned.

You can find a first draft of the calculations here. There is also a Jupyter Notebook explaining the kinematics and a YouTube video.
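As a simplified illustration of the idea (not the notebook's full 3-DOF leg solution), here is a planar two-link inverse-kinematics sketch using the law of cosines; the link lengths are placeholder values, not the real robot's dimensions:

```python
# Minimal sketch: planar 2-link inverse kinematics via the law of cosines.
# l1/l2 are placeholder link lengths; the full 3-DOF leg solution is in the notebook.
import math

def leg_ik_2d(x, y, l1=0.11, l2=0.13):
    """Return (hip_angle, knee_angle) in radians to place the foot at (x, y)."""
    d2 = x * x + y * y
    # Knee angle from the law of cosines, clamped to avoid domain errors at the reach limit.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_knee = max(-1.0, min(1.0, cos_knee))
    knee = math.acos(cos_knee)
    # Hip angle: direction to the foot, corrected for the offset caused by the bent knee.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee), l1 + l2 * math.cos(knee))
    return hip, knee

hip, knee = leg_ik_2d(0.05, -0.18)
print(math.degrees(hip), math.degrees(knee))
```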

3. Training

There is no real training code yet.

Credits and thanks

  • Deok-yeon Kim, creator of SpotMicro
  • Boston Dynamics, who built the incredible SpotMini
  • Ivan Krasin - https://ivankrasin.com/about/ - thanks for the inspiration and the chats
  • Jie Tan, Tingnan Zhang, Erwin Coumans, Atil Iscen, Yunfei Bai, Danijar Hafner, Steven Bohez and Vincent Vanhoucke (Google Brain, Google DeepMind)
