# FAQ

A non-exhaustive list of common questions from Reachy's users. If you don't see yours here, don't hesitate to drop us a message on our Discord support channel!

It's very likely that there is a problem with reachy_sdk_server, the server running Reachy's software. Check the Quick debug section.

Check the Quick debug section.

This means the motor has overheated and needs to cool down. Turn the robot off, wait for it to cool down, then turn it on again.

This means the module encountered an issue (safety shutdown, incorrect instruction, etc.). You need to turn the robot off and on again to restart the module. For more information on what the blue LED means, check the dedicated page.

Start the robot with its head in a position close to straight. Orbita's position encoder only covers part of the whole motion range, so this known starting position is used to recalibrate the absolute position.

Check the Reachy's computer is not running page, which explains why Reachy's computer may not turn on.

• If the SDK server is running, you can check Reachy's cameras as described in the Python SDK documentation.

• If not, you will need a computer screen plugged into Reachy with an HDMI cable. To view the left camera, run the following in a terminal on Reachy's computer:

python3 ~/reachy_ws/src/reachy_controllers/examples/view_cam.py left open_cv
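If you prefer to go through the SDK, a minimal viewer can be sketched as follows. This is a hedged sketch, not the official example: it assumes the reachy-sdk and opencv-python packages, a reachable SDK server (the `localhost` host is an assumption; adapt it to your setup), and that `last_frame` returns the latest OpenCV BGR image — verify the exact API against the Python SDK documentation.

```python
def view_left_camera(host: str = "localhost") -> None:
    """Display Reachy's left camera feed until the Esc key is pressed.

    Imports are done lazily so this sketch can be read (and the function
    defined) without the robot attached or the packages installed.
    """
    import cv2
    from reachy_sdk import ReachySDK

    reachy = ReachySDK(host=host)
    try:
        while True:
            frame = reachy.left_camera.last_frame  # latest BGR image
            if frame is not None:
                cv2.imshow("left camera", frame)
            if cv2.waitKey(30) == 27:  # Esc key code
                break
    finally:
        cv2.destroyAllWindows()


if __name__ == "__main__":
    view_left_camera()
```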


You can use the autofocus available in reachy-sdk. For example, to start the autofocus for the left camera, use:

reachy.left_camera.start_autofocus()
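Put in context, a small end-to-end sketch might look like this. Assumptions are hedged: the `localhost` host, the existence of a symmetric `right_camera.start_autofocus()` (only the left-camera call appears in the documentation above), and a running SDK server.

```python
def start_both_autofocus(host: str = "localhost") -> None:
    """Connect to Reachy and start the autofocus on both cameras.

    The import is lazy so the sketch can be read offline; it assumes
    the reachy-sdk package and a reachable SDK server on `host`.
    The right_camera call mirrors the documented left_camera one and
    is an assumption to verify against the Python SDK documentation.
    """
    from reachy_sdk import ReachySDK

    reachy = ReachySDK(host=host)
    reachy.left_camera.start_autofocus()
    reachy.right_camera.start_autofocus()


if __name__ == "__main__":
    start_both_autofocus()
```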


If the autofocus does not work, you can learn how to perform the focus manually on this page.

Orbita's fan is managed automatically, based on temperature limits set for Orbita in Reachy's software. You cannot control it through Reachy's SDK.

There might be a problem with the HDMI connection, or Reachy's computer may not actually be turned on. Check the Reachy's computer is not running page.

The Edge TPU Coral device uses the pycoral Python library. We use this device every time we need AI in one of our applications, for example to classify objects in TicTacToe.

Check the Overall presentation of the Software section.

The current release does not come with an autonomous navigation stack. However, we have an internal (experimental) version where nav2 runs. If this is of interest to you, please let us know.