Jenny 5 -- the robot

Jenny 5 is a fully open-source robot intended mainly for research, but it can also act as a human assistant. It has a mobile platform with rubber tracks, a flexible leg, two arms with 7 degrees of freedom each, and a head with 2 degrees of freedom. The robot is actuated by 20 motors (DC motors, steppers and servos) and its state is read with the help of numerous sensors. The robot also has 3 webcams for computer vision tasks. This paper describes the current state of the robot.


Introduction
Jenny 5 represents an attempt to build a low-cost, almost humanoid robot using tools and materials which are readily available. Jenny 5 is inspired by the Johnny 5 robot from the Short Circuit movie [6]. Work on the Jenny 5 robot started in April 2015. So far, 3 major versions have been built and tested. At each iteration the robustness of the robot has been significantly improved.
Jenny 5 (v3) has a mobile platform with tracks, a pliable leg, two arms with 7 degrees of freedom each, and a head. The current design of Jenny 5 is given in Figure 1. A photograph of the real robot is given in Figure 2.
All source files (CAD, software, etc.) for Jenny 5 are freely available on GitHub [2,1]. All code is released under the MIT license so that anyone can freely use it in both personal and commercial applications.
Several programming languages have been used for developing the Jenny 5 hardware and software: C++ (for the server and the Arduino firmware), HTML5 and JavaScript (for the client controlling the server) and OpenSCAD (for the hardware design).
Jenny 5 is easy and cheap to build. Most components can be purchased from robotics stores. Some aluminum profiles can be cut and drilled with tools available to hobbyists. Custom-made parts can be printed with a 3D printer.
The materials (excepting the onboard computer) cost less than 2500 USD. The robot can be utilized in a wide range of scenarios. Here is a short list of things that the robot could do (if programmed properly): surveillance, rescue, disaster management, house cleaning, food preparation, cleaning the kitchen table, working in the garden, fire fighting, combat missions, etc.
The Jenny 5 robot is continuously developed and updated. This paper describes the current state of the robot (also called v3).
The document is structured as follows: Section 2 indicates where the source code of the robot is stored. Section 3 contains a list of technical specifications for Jenny 5. Section 4 introduces OpenSCAD, the software used to design the robot, and then describes the structure of the CAD project.
Section 5 describes the main components of the robot (mobile platform, leg, arms, body and head). A short list of materials and necessary tools is given in sections 5.1 and 5.2.
Section 6 describes the electronic boards used for reading sensors and for moving motors. Section 7 gives the specifications of the computer controlling the robot. The batteries utilized in this project are listed in section 8. Section 9 describes the software that controls the robot (Scufy, the Arduino firmware; Scufy Lib, the Arduino PC control library; and the HTML5 client and WebSocket server). Several intelligent algorithms used by Jenny 5 are described in section 9.5. Section 10 contains a short description of some future development directions.

Source code
Jenny 5 is open source and the entire source code is stored as a GitHub organization at: https://github.com/jenny5-robot/. Individual repositories can be downloaded from the above-indicated address. The Bill of Materials (BOM) and the assembly manual are also available online.
The source code is released under the MIT license [23] so that anyone can use it in free or paid, open-source or closed-source, private or public projects. The only request is to mention the author(s).
Below is the list of repositories and their web addresses:
• CAD files - can be downloaded from https://github.com/jenny5-robot/jenny5-cad. OpenSCAD [7] is required to view them.
• Scufy - the Arduino firmware - can be downloaded from https://github.com/jenny5-robot/Scufy. It is used by the arm and head electronics. It requires the Arduino IDE [8] to compile and upload to the boards.
• Scufy Lib - the Arduino control library - can be downloaded from https://github.com/jenny5-robot/jenny5-control-module. It is used for sending commands to the Arduino firmware from a PC. It is written in C++ and requires a C++ compiler.
• WebSocket server and HTML5 client - can be downloaded from https://github.com/jenny5-robot/jenny5-html5. It is used to control the robot from a browser. The server is a console application running on the robot's PC.
• the assembly manual - can be read at https://jenny5.org/manual/. Note that it is currently under construction and will not be finished until the robot is completed.

Specifications
Technical specifications are given in Table 1.

Hardware design
This section describes the software used to design Jenny 5 and the structure of the CAD project.

CAD software
Jenny 5 is designed in OpenSCAD [7]. I have chosen this environment because it is mainly intended for programmers and I (the author) am a programmer. Parts in OpenSCAD are designed using computer instructions instead of the mouse.
The OpenSCAD IDE is very simple: one writes instructions in the left window, and the result (after pressing F5) is shown in the top-right window. In the bottom-right window the programmer can display various information by using the echo instruction.
The language offers several primitives (cube, cylinder, sphere, etc.), several Boolean operations (union, difference, intersection) and several transformations (translate, rotate, mirror, etc.). By using only these primitives and operations it is possible to create very complex geometries.
The programmer can write everything in the main file, but usually the instructions creating a part are grouped within a module.
For visualizing a part the user should press F5 (Preview) or F6 (Render). One can use the mouse to navigate through the designed part: left-button drag rotates, right-button drag pans, and the mouse wheel zooms.
For instance, a simple program which creates a nut is the following:
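A minimal sketch of such a nut module (illustrative dimensions; the actual module in the repository may differ):

```openscad
// A hexagonal nut: a 6-sided prism with a cylindrical hole.
// Dimensions are illustrative, not taken from the repository.
module nut(width = 10, height = 5, hole_radius = 2.5)
{
    difference() {
        cylinder(r = width / 2, h = height, $fn = 6); // hexagonal body
        cylinder(r = hole_radius, h = height);        // bolt hole
    }
}

nut(); // instantiate the module, then press F5 to preview
```

Here the $fn = 6 parameter forces the cylinder to be rendered with 6 facets, which yields the hexagonal outline of the nut.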
Note that for visualizing the entire robot we use Preview mode only, because Render is too slow (it can take several hours to render the entire robot). However, when we need the STL file for 3D printing, we use Render mode for that particular part.
The user can uncomment each module of the project to show it on screen.

CAD project structure
CAD files for the robot can be downloaded from https://github.com/jenny5-robot/jenny5-cad.
There are two main folders: basic scad and robot.
• basic scad -contains general components like screws, nuts, motors, sensors, pulleys, gears, bearings, etc. These components can be used in other projects too.
• robot - contains components specific to the Jenny 5 robot. Each part of the robot has its own folder. The main file of the project is called jenny5.scad. To view it, just open it in OpenSCAD and press F5.
The robot folder contains 5 subfolders:
• arm. The main file is arm.scad.
• base platform. The main file is base platform.scad.
• body. The main file is body.scad. Please note that the arms are connected to the body only in the main file of the project (jenny5.scad).
• head. The main file is head.scad.
• leg. The main file is leg.scad.

Parameters
The current position of the leg and arms is stored as numerical values (angles). These parameters are stored in the following files:
• robot/arm/arm params.scad - the angles for the arms.
• robot/leg/leg params.scad - the angle for the leg.
One can modify these parameters in order to simulate a new position of the leg or arms.
One can change the color of the plastic parts in the file basic scad/material colors.scad.

Hardware
This section describes the most important mechanical components of the robot which are the mobile platform (see section 5.3), the leg (see section 5.4), two arms (see section 5.5), a body (see section 5.7) and a head (see section 5.8).

Materials
The materials required to build Jenny 5 are given in Table 2.

Tools required to build Jenny 5
Tools required to build Jenny 5 are given in Table 3.

Mobile platform
The platform is driven by 2 DC motors with planetary gearboxes, controlled by a 15 A RoboClaw board [14]. The platform has two rubber tracks. Several components of the robot are placed on the platform: batteries, motor controllers, etc. The platform is depicted in Figure 3. Currently, the platform is quite small and the robot can become unstable (it can fall) if moved abruptly while the leg is fully extended. However, if the leg is compressed, the robot is much more stable. So, the idea is to make only low-acceleration moves when the leg is extended. In the near future we plan to add the possibility of moving the body back and forth, so that the center of gravity can be controlled easily.

Leg
The leg is driven by two 750 N linear motors with a 100 mm stroke, controlled by a 5 A RoboClaw board. Note that the motors were disassembled in order to increase the useful stroke.
Figure 4 shows the lateral and frontal views of the leg design. The leg can be compressed so that the arms can grab things from the ground.

Arms
Each arm has 6 stepper motors with various gearbox reductions (27:1 and 50:1). Joint positions are read using magnetic rotation sensors (AS5147 [16]). The motors and sensors are controlled by a Pololu A-Star board [10]. The 50:1 motors are used for lifting the entire arm and for moving the elbow. Theoretically they are capable of considerable torque. According to the specifications [24], the motors (without gearbox) have 52 N.cm of torque. Multiplied by the 50:1 reduction, this means 2600 N.cm. The efficiency of the gearboxes is 73%, so the useful torque is around 1900 N.cm. The motor does not rotate the arm directly, but through a 47:14 pulley reduction, which brings the final torque to around 6370 N.cm. At a distance of 70 cm (the arm length) this gives a force of about 91 N, which means that the arm could lift about 9 kg at 70 cm. Note that this is a theoretical upper bound, because the gearbox does not support more than 600 N.cm before damage can occur [24], so the practical upper bound is below 3 kg at 70 cm. Also note that neither friction nor the arm's weight was taken into account here; however, the arm's weight can be compensated if a spring is used. One more aspect to consider is that the torque of a stepper motor decreases at high speeds [25], so again the practical upper bound for torque is lower than the one computed above.
Still, in our experiments (see for instance the movie at https://www.youtube.com/watch?v=Rc-ppA9-12I) we have been able to lift more than 3.5 kg at 40 cm. This means a torque of at least 1400 N.cm, and no gearbox damage was observed in the long term.
Another problem is that the 50:1 gearboxes are difficult to back-drive. In the near future we plan to replace the 50:1 gearboxes with 27:1 ones, because the maximum allowed torque before damage can occur is similar.
Regarding the speed of the 50:1 gearbox motors, we have made the following computations. The maximum speed that we were able to obtain with the motor (no gearbox attached) was 1500 steps/second; above that speed the motor simply stops. Since the motor has 1.8-degree steps, this means about 7.5 complete rotations per second. After the 50:1 gearbox this becomes 0.15 rotations per second, and after the external 47:14 reduction 0.044 rotations per second, that is, about 16 degrees per second. Thus, a complete up-down move (180 degrees) of the arm takes about 11 seconds. We are currently investigating other stepper motor drivers to see if more steps per second are possible.
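The torque and speed chains above can be re-checked with a few lines of C++. The constants are the datasheet values quoted in the text; g ≈ 9.81 m/s² is assumed, and friction and arm weight are ignored, as above:

```cpp
#include <cassert>
#include <cstdio>

int main() {
    // datasheet values quoted in the text [24]
    const double motor_torque = 52.0;        // N*cm, motor without gearbox
    const double gearbox      = 50.0;        // 50:1 reduction
    const double efficiency   = 0.73;        // gearbox efficiency
    const double pulley       = 47.0 / 14.0; // external 47:14 pulley reduction
    const double arm_len      = 70.0;        // arm length, cm

    double useful  = motor_torque * gearbox * efficiency; // ~1900 N*cm
    double final_t = useful * pulley;                     // ~6370 N*cm
    double force   = final_t / arm_len;                   // ~91 N at 70 cm
    printf("useful torque: %.0f N*cm, final: %.0f N*cm, force: %.0f N (~%.1f kg)\n",
           useful, final_t, force, force / 9.81);
    assert(force > 90.0 && force < 92.0);

    // speed chain: 1500 steps/s at 1.8 deg/step = 7.5 rev/s at the motor
    double motor_rps = 1500.0 * 1.8 / 360.0;                 // 7.5 rev/s
    double arm_dps   = motor_rps / gearbox / pulley * 360.0; // ~16 deg/s at the arm
    printf("arm speed: %.1f deg/s -> 180 deg in %.1f s\n",
           arm_dps, 180.0 / arm_dps);
    assert(arm_dps > 15.0 && arm_dps < 17.0);
    return 0;
}
```

The two assertions simply confirm the ~91 N (~9 kg at 70 cm) and ~16 deg/s figures derived in the text.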
Each arm is controlled independently. The electronic boards are placed on each arm separately.
The arm is depicted in Figure 5.

Gripper
Each arm has a gripper actuated by a servo motor. The gripper is connected to the same board as the rest of the arm. The gripper also carries a webcam used for recognizing the objects in front of it. The gripper is depicted in Figure 6.

Body
The body is a carbon-fiber frame which has supports for the 2 arms and connectors for the leg (at the bottom) and the head (at the top). The body is depicted in Figure 7.

Head
The head has 2 degrees of freedom, ensured by two Nema 11 stepper motors. The position of the motors is read by 2 AS5147 sensors. The head has a webcam for object detection and an ultrasonic sensor for distance measurement.
All head components, except the camera, are connected to an Arduino Nano board.

Electronics
This section describes the electronic components utilized by Jenny 5. First we list the materials and tools required to build the electrical part of the robot. Then we describe the custom PCBs built for the arms and the head.

Electrical / electronic materials
A list of materials is given in Table 4. Note that the motors were already listed among the mechanical materials in section 5.1.

Tools
The tools required to build the electrical parts of the robot are given in Table 5.

Arm custom PCB
The PCB for the arm is depicted in Figure 8. The electronics were designed using Fritzing [20].

The brain
The brain of the robot is a 13" laptop with an i7-6700 processor, a 128 GB SSD and 4 GB of RAM, placed on the platform. The laptop has 3 USB-A ports (more ports are better; otherwise some USB hubs are needed). The laptop sends commands to the A-Star, Arduino and RoboClaw boards and reads images from the webcams.
Obviously, any other laptop can be utilized, preferably small and powerful. A low-cost computer like the Raspberry Pi [21] can be used too, if the robot is not doing any computationally expensive tasks like computer vision.

Power supply
The Jenny 5 robot is powered by multiple Li-Po rechargeable batteries. Currently, a 16 Ah battery powers both arms and the head, and a 10 Ah battery powers the platform and the leg. The laptop is powered by its own battery.

The software controlling the robot
The robot components are controlled by various programs. The arms and head run Scufy - the Arduino firmware - which accepts commands sent from the PC with the help of Scufy Lib. The RoboClaw boards run a proprietary firmware and accept commands sent by a C++ library running on the PC.

Scufy -the Arduino firmware
The A-Star and Arduino boards run a specially crafted firmware called Scufy (see the source code in reference [3]), which is able to control multiple motors and read a variety of sensors: buttons, AS5147 magnetic rotary encoders, HC-SR04 ultrasonic sensors, potentiometers, infrared sensors, TeraRanger One, etc.
The firmware is asynchronous and does not block the board when performing long-running operations (such as moving a motor to a new position, which can sometimes take several seconds, or reading an ultrasonic sensor).
The Scufy firmware accepts commands sent through the serial port. All commands are terminated by the # character. Almost all commands send a response to the serial port; the response is also terminated by the # character.
Commands related to motors and sensors require an object (a controller) to be created internally for each type of motor or sensor. The controller stores how many motors/sensors of that type there are and the Arduino pins where each motor/sensor is connected. So, before moving motors or reading sensors, the user must create a controller for them. The motors/sensors inside a controller are indexed from 0.
The list of the most important Scufy commands is given below. The user should read the entire, up-to-date list of commands in the source code of the program.

Create controllers commands
• CS n dir_0 step_0 en_0 dir_1 step_1 en_1 ... dir_{n-1} step_{n-1} en_{n-1}#
Creates the stepper motors controller and sets some of its parameters. n is the number of motors; dir_k, step_k and en_k are the Arduino pins which command the direction, the step and the enable state.
• CV n pin_0 pin_1 ... pin_{n-1}#
Creates the servo motors controller. n is the number of motors; pin_k are the Arduino pins where the motors are connected.
Outputs CV# when done.
• CA n p_0 p_1 ... p_{n-1}#
Creates the AS5147 sensors controller. n is the number of sensors; p_k are the Arduino pins where the sensors are connected.
Outputs CA# when done.

Attach sensors to motors
• ASx n Py end_1 end_2 home direction Ak end_1 end_2 home direction#
Attaches, to stepper motor x, a list of n sensors (like Potentiometer y, Button z, AS5147 k, etc.).
y is the sensor index in the list of sensors of that type. direction specifies whether increasing values of the motor position also increase the values of the sensor.
Outputs ASx# when done.

Stepper Commands
• SM x y#
Moves stepper motor x by y steps. If y is negative the motor runs in the opposite direction. The motor remains locked at the end of the movement. The first motor has index 0.
Outputs SM x d# when the motor rotation is over. If the movement was completed, then d is 0; otherwise it is the distance left to go.
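Putting the conventions together, a hypothetical exchange for this command could look like the following (illustrative only; the exact token spacing of commands and replies should be checked against the Scufy source):

```
PC -> firmware:   SM 0 200#     (move stepper 0 by 200 steps)
firmware -> PC:   SM 0 0#       (movement finished; 0 steps left to go)
```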

• SHx#
Moves stepper motor x to the home position.
The first sensor in the list of sensors will establish the home position. The motor does nothing if no sensor is attached.

• SLx#
Locks stepper motor x in the current position.
• SSx speed acceleration#
Sets the speed of stepper motor x to speed and its acceleration to acceleration.
Outputs SSx# when done.

• ST x#
Stops motor x.
Outputs ST x# when done.

• SGx position#
Moves stepper motor x until the first sensor in its list of sensors reads position.
Outputs SM x d# when the motor rotation is over. If the movement was completed, then d is 0; otherwise it is the distance left to go.

• VMx position#
Moves servo motor x to position.
Outputs VM x d# when done. If the move is completed, d is 0; otherwise d is 1.

• VHx#
Moves servo motor x to the home position. The home position is given by the first sensor in the list of attached sensors.
Outputs VHx#.

• RAx#
Reads the value of AS5147 sensor x.

• E#
The firmware outputs this string if there is something wrong with a given command.

• Iinformation#
The firmware can output this string, containing useful information about the progress of a command.

Scufy Lib -the PC library
Scufy Lib [4] is a C++ library that sends commands (as strings) to, and reads results from, the electronic boards powering the arms, head, etc. Communication between the onboard computer and the electronic boards takes place over a serial port.
The methods described below are part of Scufy Lib. The user should read the entire list of commands in the source code of the library.

Scufy Lib events
Strings received from the Scufy firmware are translated by update_commands_from_serial() into events, which are stored in a queue. Each event has a particular type. A short list of event types is given in Table 6.

Example of utilization
In this section we give some examples of utilization of Scufy Lib.
After sending a command to the Arduino firmware, the PC program should wait for the answer in an asynchronous way. Since the answer is not instantaneous, the programmer should create a loop in which the program waits for the answer. The basic idea is the following:
• send a command,
• use update_commands_from_serial() to extract the strings sent by the firmware,
• use query_for_event() to determine whether a particular event has been received.

HTML 5 client and PC WebSocket server
The robot can be manually controlled by an HTML5 application running in the browser of a smartphone. The HTML5 application connects to the server running on the robot. The server is the one that actually executes the commands (moving motors, reading sensors, etc.). The server is built on top of a lightweight WebSocket server (a single source file) written by Eduard Suică [17]. The server uses the TLSe library [18] for the secured communication protocol.
The server requires a certificate to run. A sample certificate has been generated and stored in the certificates folder of the server. This will work with no problems on smartphones running Android. However, for iOS a new certificate must be generated. More details on how to do this for iOS can be found in reference [22].
The Jenny 5 web client allows controlling one motor (as in the case of the arms) or at most two motors (as in the case of the platform, leg and head) at a time. In this scenario the utilization of the application is very simple:
• the user presses a button (to select the motor that he wants to move),
• then tilts the smartphone,
• the client application reads the gyroscope of the smartphone,
• the client application sends the angle to the server on the robot,
• and the robot acts accordingly.
The client can also request a picture from the robot and display it in the browser window.
The web client also accepts voice commands. This feature is implemented by using the Speech Recognition API from HTML5 [19].
A screenshot of the HTML5 client is depicted in Figure 9.

Intelligent algorithms
Currently there are two intelligent algorithms implemented on the robot:
• An algorithm which finds the closest face and moves the head's motors in order to center it in the camera view. Only the head is involved in this operation.
• An algorithm which follows the closest person. It first detects the closest face; then, if the person is too close, the robot moves backward, otherwise it moves forward.
Both algorithms use OpenCV [12] for face detection.

Weaknesses and future development
The Jenny 5 robot is under active development. During its development we have observed several weaknesses that we plan to address in future iterations.
Some short-term ideas are listed in Table 7: for instance, determining the true speed of the platform, and adding a gyroscope on the platform to determine whether the robot is about to fall.