Welcome to my site

My name is Ignacio (Nacho) Mellado. I build intelligent machines with visual perception and control. On this website you will find a selection of my projects, experiments and random thoughts.

How to get into robotics if you are a CS major

I was recently asked by a Computer Science (CS) major about a good roadmap to get into robotics. This was my response, in case it can help someone else:

My approach to learning is normally learning by making. If that’s the case for you too, the best way to learn how to make your own robot is to start making your own robot. That should be the forcing function [...]

Linear algebra: to express locations and movements of the robot and other things in 3D, e.g. vectors, homogeneous matrices, quaternions… There's a lot of overlap with 3D graphics here (see the first sketch after this list).

Synchronization: robots are almost always distributed systems; even when there’s only one processor, there are many independent sensors and actuators that can’t be queried/commanded [...]
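
To make the linear algebra item concrete, here is a minimal C++ sketch of a 4x4 homogeneous transform: a rotation and a translation packed into one matrix, so that chaining coordinate frames (world, robot base, camera) becomes plain matrix multiplication. The frames and numbers are made up for illustration, and a real project would use a library like Eigen instead of raw arrays.

    #include <array>
    #include <cmath>
    #include <cstdio>

    using Mat4 = std::array<std::array<double, 4>, 4>;
    using Vec4 = std::array<double, 4>;

    // Homogeneous transform: rotation of `yaw` radians about Z plus a translation.
    Mat4 makeTransform(double yaw, double tx, double ty, double tz) {
        const double c = std::cos(yaw), s = std::sin(yaw);
        Mat4 m = {{{c, -s, 0, tx},
                   {s,  c, 0, ty},
                   {0,  0, 1, tz},
                   {0,  0, 0, 1}}};
        return m;
    }

    // Compose two transforms: the result applies `b` first, then `a`.
    Mat4 compose(const Mat4& a, const Mat4& b) {
        Mat4 r{};
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                for (int k = 0; k < 4; ++k)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }

    // Transform a point given in homogeneous coordinates (w = 1).
    Vec4 apply(const Mat4& m, const Vec4& p) {
        Vec4 r{};
        for (int i = 0; i < 4; ++i)
            for (int k = 0; k < 4; ++k)
                r[i] += m[i][k] * p[k];
        return r;
    }

    int main() {
        const double kHalfPi = std::acos(0.0);  // 90 degrees
        // Pose of the robot base in the world, and of a camera mounted on the base.
        Mat4 worldFromBase   = makeTransform(kHalfPi, 1.0, 0.0, 0.0);
        Mat4 baseFromCamera  = makeTransform(0.0, 0.2, 0.0, 0.5);
        // Chain the frames: world <- base <- camera.
        Mat4 worldFromCamera = compose(worldFromBase, baseFromCamera);
        // A point 1 m in front of the camera, expressed in world coordinates.
        Vec4 pWorld = apply(worldFromCamera, {0.0, 0.0, 1.0, 1.0});
        std::printf("point in world: (%.2f, %.2f, %.2f)\n", pWorld[0], pWorld[1], pWorld[2]);
        return 0;
    }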
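
And a minimal sketch for the synchronization item: one thread produces sensor readings at its own rate while the control loop consumes the latest value at a different rate, with a mutex guarding the shared state. The 50 Hz rangefinder and the stop-or-cruise rule are invented for the example; a real robot would sit this on top of some middleware, but the pattern is the same.

    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <mutex>
    #include <thread>

    // Latest sensor reading shared between threads, protected by a mutex.
    struct SharedRange {
        std::mutex mtx;
        double meters = 2.0;
    };

    int main() {
        SharedRange range;
        std::atomic<bool> running{true};

        // Sensor thread: publishes new readings at ~50 Hz, independently of the control loop.
        std::thread sensor([&] {
            double r = 2.0;
            while (running) {
                r -= 0.01;  // pretend the obstacle gets closer
                {
                    std::lock_guard<std::mutex> lock(range.mtx);
                    range.meters = r;
                }
                std::this_thread::sleep_for(std::chrono::milliseconds(20));
            }
        });

        // Control loop: runs at ~10 Hz and always uses the freshest reading available.
        for (int i = 0; i < 10; ++i) {
            double r;
            {
                std::lock_guard<std::mutex> lock(range.mtx);
                r = range.meters;
            }
            double command = (r < 1.5) ? 0.0 : 0.5;  // stop if too close, else cruise
            std::printf("range=%.2f m -> speed command=%.1f m/s\n", r, command);
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }

        running = false;
        sensor.join();
        return 0;
    }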

How I hacked a vintage C++ compiler to support exceptions before they were standard

TL;DR I hacked a 28-year-old C++ compiler to support exceptions. That’s two years before exceptions were even part of the first C++ standard!

The compiler is the Watcom C++32 Optimizing Compiler Version 11.0, from 1996, by Sybase, Inc. Around that time, some friends and I started a real-time graphics [...]

Fig. 1: The Watcom C++32 Optimizing Compiler Version 11.0 running on DOSBox.

Years later, I resumed maintaining the library. Real-time code was mostly written in assembly, but I started adding C++ support to integrate faster. However, with error checking everywhere, the code started [...]
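
To give an idea of the kind of error checking involved (a generic illustration, not the library's actual code; the function names and the texture file are made up): with return codes, every call that can fail has to be tested and propagated by hand, while with exceptions the happy path reads straight through and failures jump to a single handler.

    #include <cstdio>
    #include <stdexcept>

    // Error-code style: every step returns a status the caller must check and forward.
    int loadTexture(const char* name) { return name ? 0 : -1; }
    int buildMesh() { return 0; }

    int initSceneWithCodes() {
        int err = loadTexture("wall.pcx");
        if (err != 0) return err;   // check...
        err = buildMesh();
        if (err != 0) return err;   // ...and check again, at every call site
        return 0;
    }

    // Exception style: the happy path reads straight through;
    // a failure is reported once, at the throw site.
    void loadTextureOrThrow(const char* name) {
        if (!name) throw std::runtime_error("texture not found");
    }
    void buildMeshOrThrow() {}

    void initSceneWithExceptions() {
        loadTextureOrThrow("wall.pcx");
        buildMeshOrThrow();
    }

    int main() {
        if (initSceneWithCodes() != 0) std::printf("init failed (error code)\n");
        try {
            initSceneWithExceptions();
        } catch (const std::exception& e) {
            std::printf("init failed: %s\n", e.what());
        }
        return 0;
    }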

HF1: Home companion robot
HF1 is a companion robot for the home. It understands you. It interacts with you. It keeps your data private.

The Perceptive Portable Device
Can a smartphone perceive the environment like a human does? Portable devices are full of sensors, but they are still very limited in understanding what is happening from a human perspective: Where am I inside the building? Is my user healthy? Is the baby crying? This side project is my quest to give portable devices such capabilities.
Autonomous LinkQuad quadcopter with Computer Vision
MAVwork, my open-source framework for visual control of multirotors, now supports a new quadcopter from UAS Technologies Sweden. Everything was tested with a speed control application. Watch a semiautonomous flight of this elegant new drone.
Autonomous Pelican quadcopter with Computer Vision
See how the versatile Pelican from Ascending Technologies acquired basic automatic take-off, hover and landing capabilities thanks to MAVwork, the open-source framework for drone control. Watch the open MultirotorController4mavwork in action.
Camera localization with visual markers
There are tons of applications where it is key to know the accurate location of things in a workspace. With these cheap and easy-to-build visual markers, you can know the position and attitude of anything with a camera on it. They block less visual space and offer less air resistance than equivalent-size 2D codes.
MAVwork released for Parrot AR.Drone
MAVwork is a framework for drone control that was born in 2011 during a short research stay at the Australian Research Centre for Aerospace Automation (ARCAA). Read about the inception of MAVwork and watch a video of the first test controller for a Parrot AR.Drone with a Vicon system.
Laura: Self-driving public transportation. Prototype II.
Discover how this 12-ton truck was automated to drive itself with Computer Vision. This was the second prototype, after a Citroën C3, in a project led by Siemens to develop a self-driving public transportation system.
Laura: Self-driving public transportation. Prototype I.
Tall buildings blocking the GPS signal, lane markings and road signs hidden by traffic, ... Cities can be a very harsh environment for a driverless bus trying to know where it is and where to go. In this project, led by Siemens, I explored a solution with Computer Vision.