Where robots dream of electric sheep...

Monday, August 28, 2017

An alternative to the MuJoCo based OpenAI gyms: The pybullet environment for use with the OpenAI Gym Reinforcement Learning Research Platform

OpenAI Gym is currently one of the most widely used toolkits for developing and comparing reinforcement learning algorithms. Unfortunately, several of its challenging continuous control environments require the user to install MuJoCo, a commercial physics engine that needs a license to run for longer than 30 days. Such a commercial barrier hinders open research, especially given that other suitable physics engines exist. To satisfy the community's strong demand, we provide alternative implementations of the original MuJoCo environments that can be used free of charge. The environments have been reimplemented using pybullet, the Python wrapper of Bullet Physics, such that they integrate seamlessly into the OpenAI Gym framework. To show the usability of the new environments, several RL agents from Keras-RL are configured to be trained out of the box. To further simplify the training of agents, a Trainer class was implemented that captures command-line arguments in a unified fashion. The Trainer provides a set of standard arguments, but the agent and the environment can define additional arguments, enabling the researcher to pass special parameters to either one.
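The Trainer's argument handling can be sketched with Python's argparse. Note that the Trainer class ships with the pybullet examples, so the function and argument names below are hypothetical illustrations of the idea, not its actual API:

```python
import argparse

def build_parser(agent_args=None, env_args=None):
    """Build a parser with standard arguments plus optional
    agent- and environment-specific extras (hypothetical sketch)."""
    parser = argparse.ArgumentParser(description="Train an RL agent")
    # Standard arguments shared by all agents and environments.
    parser.add_argument("--env", default="AntBulletEnv-v0",
                        help="Gym environment id")
    parser.add_argument("--steps", type=int, default=1000000,
                        help="number of training steps")
    # Additional arguments contributed by the agent or the environment.
    for name, kwargs in (agent_args or {}).items():
        parser.add_argument(name, **kwargs)
    for name, kwargs in (env_args or {}).items():
        parser.add_argument(name, **kwargs)
    return parser

parser = build_parser(
    agent_args={"--learning-rate": {"type": float, "default": 1e-3}})
args = parser.parse_args(["--env", "HopperBulletEnv-v0",
                          "--learning-rate", "3e-4"])
print(args.env, args.steps, args.learning_rate)
```

This way the researcher only has to declare the extra parameters; the standard ones come for free.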

To use the environments from pybullet, install pybullet version 1.2.6 or higher using pip:

pip install pybullet

The following things can be done using pybullet.

  • You can enjoy pretrained environments:
python -m pybullet_envs.examples.enjoy_TF_AntBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_HalfCheetahBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_HopperBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_HumanoidBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_InvertedDoublePendulumBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_InvertedPendulumBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_InvertedPendulumSwingupBulletEnv_v0_2017may
python -m pybullet_envs.examples.enjoy_TF_Walker2DBulletEnv_v0_2017may

  • Run some gym environment test:

python -m pybullet_envs.examples.racecarGymEnvTest

  • Train an agent based on OpenAI baselines DQN:

python -m pybullet_envs.examples.train_pybullet_cartpole
python -m pybullet_envs.examples.train_pybullet_racecar

(the training will save a .pkl file with the weights; leave it running for a while, as it terminates once it reaches a reasonable reward)

python -m pybullet_envs.examples.enjoy_pybullet_cartpole
python -m pybullet_envs.examples.enjoy_pybullet_racecar
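The saved .pkl file is simply a pickled snapshot of the trained parameters. As a rough, hypothetical illustration of the save/restore cycle (the real scripts store the DQN parameters via OpenAI baselines, and the filename and weights below are just stand-ins):

```python
import pickle

# Hypothetical stand-in for trained network weights; the real file
# holds the DQN parameters saved by the baselines training script.
weights = {"layer1/w": [0.1, -0.2, 0.3], "layer1/b": [0.0]}

# Saving a snapshot, as the training script does once the reward
# threshold is reached:
with open("cartpole_model.pkl", "wb") as f:
    pickle.dump(weights, f)

# Restoring it later, as the enjoy script does before rendering:
with open("cartpole_model.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored == weights)  # True
```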

For your own learning/training, create/import a specific Gym environment:

import gym
import pybullet_envs
env = gym.make("AntBulletEnv-v0")
env = gym.make("HalfCheetahBulletEnv-v0")
env = gym.make("HopperBulletEnv-v0")
env = gym.make("HumanoidBulletEnv-v0")
env = gym.make("Walker2DBulletEnv-v0")
env = gym.make("InvertedDoublePendulumBulletEnv-v0")
env = gym.make("InvertedPendulumBulletEnv-v0")
env = gym.make("MinitaurBulletEnv-v0")
env = gym.make("RacecarBulletEnv-v0")
env = gym.make("KukaBulletEnv-v0")
env = gym.make("CartPoleBulletEnv-v0")

If you want to enable human/GUI rendering in a Gym-created environment, call env.render(mode="human") BEFORE the first env.reset().
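A typical learning loop then looks like the sketch below. So that the sketch runs even without pybullet installed, a tiny hypothetical stub stands in for the environment; with pybullet available, replace it with one of the gym.make(...) calls above.

```python
import random

random.seed(0)  # make this sketch's rollout deterministic

class StubActionSpace:
    """Mimics gym's action_space.sample() on a tiny discrete set."""
    def sample(self):
        return random.choice([-1.0, 0.0, 1.0])

class StubEnv:
    """Hypothetical stand-in exposing the Gym interface; replace with
    gym.make("AntBulletEnv-v0") when pybullet is installed."""
    action_space = StubActionSpace()

    def reset(self):
        return [0.0, 0.0]  # initial observation

    def step(self, action):
        obs = [random.random(), random.random()]
        reward = 1.0                   # dummy reward per step
        done = random.random() < 0.05  # episode ends eventually
        return obs, reward, done, {}

env = StubEnv()
# With a real Bullet env, enable the GUI BEFORE the first reset:
# env.render(mode="human")
obs = env.reset()
total_reward = 0.0
for t in range(1000):  # cap the episode length
    obs, reward, done, info = env.step(env.action_space.sample())
    total_reward += reward
    if done:
        break
print("episode return:", total_reward)
```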

Monday, July 10, 2017

How to run Processing headlessly and export your Processing visualization as an .svg using Apache Batik

For an evaluation at my company, I recently looked into different Java drawing libraries that make it easier to draw beautiful data visualizations than plain Java AWT. So I looked into Processing, a framework similar to d3.js but written entirely in Java. Its simple syntax and powerful graphics quickly convinced me. However, another requirement of the evaluation was the ability to export the graphics to .svg format so that they scale well. So my main question was whether it is possible at all to export .svgs from a Processing drawing. I suspected there might be a bridge to Apache Batik via the basic awt.Graphics2D, since both Batik's SVGGraphics2D and Processing's PGraphicsJava2D build on it. After a bit of coding trial and error, I could make it work, although it is a bit hacky: I simply replaced the g2 Graphics2D canvas inside the PGraphicsJava2D with an Apache Batik SVGGraphics2D (which inherits from java.awt.Graphics2D and is meant to be dropped into any AWT-compatible drawing method). Importantly, the replacement must take place after beginDraw() is called, as beginDraw() resets g2. The following example demonstrates this:

import java.awt.Dimension;
import java.io.StringWriter;
import java.io.Writer;

import org.apache.batik.dom.GenericDOMImplementation;
import org.apache.batik.svggen.SVGGraphics2D;
import org.apache.batik.svggen.SVGGraphics2DIOException;
import org.w3c.dom.DOMImplementation;
import org.w3c.dom.Document;

import processing.awt.PGraphicsJava2D;
import processing.core.PApplet;

public abstract class ProcessingDrawer extends PApplet {

 PGraphicsJava2D canvas;     // Processing graphics canvas
 SVGGraphics2D svgGenerator; // Apache Batik SVG generator
 int width, height;          // width and height of the canvas
 String svg;                 // the exported SVG content

 public ProcessingDrawer() {
  this(400, 400);            // default canvas size
 }

 public ProcessingDrawer(int width, int height) {
  this.width = width;
  this.height = height;
 }

 public void run() {
  PApplet.main(new String[] {this.getClass().getCanonicalName()});
 }

 public void settings() {
  size(width, height);
 }

 public void setup() {
  // Get a DOMImplementation.
  DOMImplementation domImpl = GenericDOMImplementation.getDOMImplementation();

  // Create an instance of org.w3c.dom.Document.
  String svgNS = "http://www.w3.org/2000/svg";
  Document document = domImpl.createDocument(svgNS, "svg", null);

  // Create an instance of the SVG generator.
  svgGenerator = new SVGGraphics2D(document);
  svgGenerator.setSVGCanvasSize(new Dimension(width, height));

  canvas = new PGraphicsJava2D();
  canvas.setSize(width, height);
 }

 /**
  * Implement this to draw your picture.
  */
 public abstract void paintCanvas();

 public void draw() {
  canvas.beginDraw();
  // Replace the g2 canvas AFTER beginDraw(), as beginDraw() resets g2.
  getCanvas().g2 = svgGenerator;
  paintCanvas();
  canvas.endDraw();
  svg = exportSVG();
  System.out.println(svg);
  noLoop();
 }

 private String exportSVG() {
  return exportSVG(true);
 }

 private String exportSVG(boolean useCSS) {
  String svgContent = "";
  try {
   Writer out = new StringWriter();
   svgGenerator.stream(out, useCSS);
   svgContent = out.toString();
  } catch (SVGGraphics2DIOException e) {
   e.printStackTrace();
  }
  return svgContent;
 }

 public PGraphicsJava2D getCanvas() {
  return canvas;
 }

 public String getSVG() {
  return svg;
 }
}
Now you can extend the ProcessingDrawer as follows:

import java.util.Random;

public class SimpleSketch extends ProcessingDrawer {

 Random r = new Random();

 public SimpleSketch() {
 }

 public SimpleSketch(int width, int height) {
  super(width, height);
 }

 public void paintCanvas() {
  for (int i = 0; i < 30; i++) {
   getCanvas().stroke(r.nextInt(255), r.nextInt(255), r.nextInt(255));
   getCanvas().ellipse(r.nextInt(width), r.nextInt(height), 60, 60);
  }
 }

 public static void main(String[] args) {
  SimpleSketch ss = new SimpleSketch(400, 400);
  ss.run();
 }
}

This code prints the generated SVG markup to your command line; rendered, the .svg shows 30 circles with randomly colored outlines.

I am publishing this here because I could not find an elegant solution on the web, so perhaps someone else will find this approach useful. If you have any suggestions, just post below.

Tuesday, December 20, 2016

Evolutionary Algorithms with simulated rigid-body spiders

Recently I created an example for the Bullet Physics Example Browser: an evolutionary algorithm applied to the controllers of rigid-body spiders. Using the controls, one can change the dimensions of the legs and torso as well as the simulation speed and other parameters of the simulation. Here is a video of it running in the Bullet Physics Example Browser, uploaded by Erwin Coumans: