HF1: End of 2024 Recap

Posted on Jan 9 2025

With 2024 behind us, I wanted to share with everyone what the HF1 project accomplished last year and what to expect in 2025.

Every sign of support is one more push for HF1 to become a real product. If you joined the waitlist, talked about HF1, or liked or reposted my posts, thank you.

One of my New Year’s resolutions is to turn HF1 into a DIY kit with a learning experience built around it. If you want that too, please spread the word. There’s a section at the end with ways you can help the project take off.

 

What we accomplished in 2024

A lot has happened since I started working full-time on the project in mid-April. The prototype’s hardware was mostly in place back then, from the time HF1 was just a fun way to pass the time on weekends. However, the software was just a bunch of hardware drivers running on a microcontroller board and a library for basic communication with a Jetson board.

Today, the robot is able to understand voice commands, localize itself, and follow pre-programmed trajectories with different expressive gaits. Moreover, it can be controlled remotely with an iPhone app. And, because it’s based on ROS2, it can be extended with components of all kinds from a rich ecosystem.

Let’s dive into each milestone:

Low-latency, secure speech understanding

I had seen talking robots calling APIs over the Internet. As cool as they were, the latency kept them from feeling fun to me. They did not feel alive. In addition, many folks had expressed concerns about planting an always-listening mobile microphone at home.

It was very clear that HF1 had to follow a different approach. The latency had to be lower, much lower. And users had to have the freedom to keep their data private without losing basic robot abilities. Was it even possible to understand speech with low-power, affordable hardware? Would there be room to run other models concurrently for vision, planning, and thinking in general? That was something I had to de-risk as soon as possible.

After weeks of optimizing the onboard speech understanding system (and a little help from a new cooling fan), the result is responsive and accurate enough for the interaction to feel natural.

Accurate localization

HF1 has to know how its location changed over time to move around and gesture in response to the user. I equipped it with hardware for that: wheel encoders and an Inertial Measurement Unit (IMU).

This year, I created a localization algorithm that fuses the signals from the encoders and the IMU to accurately infer the pose of the robot as it moves. This algorithm runs on the microcontroller board at a stable rate. Thanks to it, the robot can execute repeatable motions.
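For the curious, here is the flavor of that fusion, assuming a differential-drive base like HF1’s. This is a simplified sketch with made-up names, not the exact algorithm running on the microcontroller, but the core idea is the same: the gyro does not slip and the encoders do not drift, so each covers the other’s weakness.

```cpp
#include <cmath>

// Simplified sketch of encoder+IMU fusion for a differential-drive base.
// Names and the complementary-filter blend are illustrative, not the
// exact HF1 code.
struct Pose2d {
  double x = 0, y = 0, theta = 0;  // meters, meters, radians
};

class OdometryFilter {
 public:
  OdometryFilter(double wheel_base, double imu_weight)
      : wheel_base_(wheel_base), imu_weight_(imu_weight) {}

  // left/right: distance traveled by each wheel since the last call (m).
  // imu_yaw_rate: gyro reading around the vertical axis (rad/s).
  // dt: time since the last call (s).
  void Update(double left, double right, double imu_yaw_rate, double dt) {
    const double distance = 0.5 * (left + right);
    // Heading change according to the encoders alone.
    const double encoder_dtheta = (right - left) / wheel_base_;
    // Blend the encoder heading change with the integrated gyro rate.
    const double dtheta = (1.0 - imu_weight_) * encoder_dtheta +
                          imu_weight_ * imu_yaw_rate * dt;
    // Integrate position at the midpoint heading for better arcs.
    const double mid_theta = pose_.theta + 0.5 * dtheta;
    pose_.x += distance * std::cos(mid_theta);
    pose_.y += distance * std::sin(mid_theta);
    pose_.theta += dtheta;
  }

  const Pose2d& pose() const { return pose_; }

 private:
  double wheel_base_;  // distance between the wheels (m)
  double imu_weight_;  // 0 = trust encoders only, 1 = trust the gyro only
  Pose2d pose_;
};
```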

Programmable trajectory controller

Once you know how far the robot has moved and in what direction, it is possible to correct its course and drive it where you want. HF1 achieves this with a number of controllers that, every fraction of a second, adjust the motors and servos until the robot reaches the intended pose.
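In spirit, each of those controllers is a tiny feedback loop like the simplified control tick below. The gains and names are illustrative, not HF1’s actual controllers, which have more terms and limits:

```cpp
#include <cmath>

struct Pose2d { double x, y, theta; };        // meters, meters, radians
struct Velocity { double linear, angular; };  // m/s, rad/s

// One control tick, simplified: compare where the robot is with where it
// should be, and command velocities proportional to the error.
Velocity ControlTick(const Pose2d& current, const Pose2d& target) {
  const double dx = target.x - current.x;
  const double dy = target.y - current.y;
  const double distance = std::sqrt(dx * dx + dy * dy);
  // Heading error toward the target point, wrapped to [-pi, pi].
  const double heading_error =
      std::remainder(std::atan2(dy, dx) - current.theta, 2.0 * M_PI);
  const double kLinearGain = 0.8, kAngularGain = 2.0;
  return {kLinearGain * distance, kAngularGain * heading_error};
}
```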

I know it sounds complicated, but fret not; I dealt with all that so you don’t have to. The on-robot API lets you define a trajectory with waypoints, for both the robot’s base and head. You can then focus on the fun stuff, while the motion controller follows the trajectory and compensates for disturbances.

Oh, and the trajectory representation is powerful enough to define gaits. Watching the robot express its mood with its movements makes it feel alive.
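To make both of those points concrete, here is roughly what a trajectory, gait included, can look like as data. Every name below is a placeholder I made up for this post, not the actual on-robot API:

```cpp
#include <vector>

// Placeholder types for illustration; the real API differs.
struct Waypoint {
  double time_s;                // when the robot should reach this waypoint
  double x, y, yaw;             // target pose of the base
  double head_pitch, head_yaw;  // target pose of the head
};

using Trajectory = std::vector<Waypoint>;

// A gait is just a trajectory too: a short sequence of waypoints that the
// controller can loop while the base makes progress. This one would read
// as a happy little head wiggle.
Trajectory MakeHappyWiggle() {
  return {
      {0.0, 0.00, 0.0, 0.0, 0.15, -0.3},
      {0.4, 0.10, 0.0, 0.0, -0.15, 0.3},
      {0.8, 0.20, 0.0, 0.0, 0.15, -0.3},
  };
}
```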

Fast and robust inter-board communication

The Jetson board is meant to run all the high-level intelligence in the robot. That’s where decisions are made. And it has to communicate with the microcontroller board that executes the motions.

For that communication to happen, I created a custom point-to-point protocol. I’m not going to bore you with the details, but it has nice properties that are not very common.

On the one hand, it can prioritize urgent messages to minimize delay. You could be transferring error logs, and the IMU readings or teleop commands would still arrive on time because they would take priority. 

On the other hand, the protocol is robust to connection interruptions. I like that because it reduces the cognitive load while developing: “what should I boot first?”, “should I reboot one board after reprogramming the other?”. Just restart the boards in any order and the protocol will manage.
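If you are curious about the priority part, the sketch below captures its core: outgoing packets wait in a priority queue, so a teleop command never sits behind a bulk log transfer. The names are mine for this post, and the real protocol layers framing, integrity checks and reconnection handling on top.

```cpp
#include <cstdint>
#include <queue>
#include <vector>

// Illustrative core of a prioritized outgoing packet queue; not the
// actual HF1 protocol code.
enum class Priority : uint8_t { kRealtime = 0, kControl = 1, kBulk = 2 };

struct Packet {
  Priority priority;
  uint64_t sequence;  // tie-breaker: FIFO order within a priority level
  std::vector<uint8_t> payload;
};

struct LowerUrgency {
  bool operator()(const Packet& a, const Packet& b) const {
    if (a.priority != b.priority) return a.priority > b.priority;
    return a.sequence > b.sequence;  // earlier packets go out first
  }
};

class OutgoingQueue {
 public:
  void Push(Priority p, std::vector<uint8_t> payload) {
    queue_.push({p, next_sequence_++, std::move(payload)});
  }

  // The link driver pops the most urgent packet whenever the wire is
  // free, so IMU readings and teleop commands overtake queued log chunks.
  Packet Pop() {
    Packet packet = queue_.top();
    queue_.pop();
    return packet;
  }

  bool Empty() const { return queue_.empty(); }

 private:
  std::priority_queue<Packet, std::vector<Packet>, LowerUrgency> queue_;
  uint64_t next_sequence_ = 0;
};
```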

ROS2 and Docker containers

This project started without ROS or containerization. Up until summer, I had hacked my way to the core capability demos. It was the fastest path to gauge what was possible. I only had two software components on the Jetson and some functions on the microcontroller board. Life was good.

As the project moved forward, the system grew, and not having a consistent way to componentize new features was starting to slow me down. One day, I was resolving dependency conflicts; the next day, I was designing yet another communication protocol between processes. That’s when I knew the time for system architecture had come.

I ended up writing a module system for the Jetson based on Docker containers. Containerizing each module avoids dependency conflicts, and the system includes scripts to create, manage, and orchestrate modules. New modules come with all the dependencies needed to create ROS2 nodes and interact with nodes in other modules.
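If you have written ROS2 code before, the nodes inside these modules look completely ordinary. A minimal node like the one below (with a topic name I made up for this post) runs inside its container and reaches nodes in other modules through ROS2’s usual discovery:

```cpp
#include <chrono>
#include <memory>

#include <rclcpp/rclcpp.hpp>
#include <std_msgs/msg/string.hpp>

// Minimal illustrative node: publishes a heartbeat once a second. The
// node and topic names are made up; any node in another module could
// subscribe to it as if both lived in the same process space.
class HeartbeatNode : public rclcpp::Node {
 public:
  HeartbeatNode() : Node("heartbeat") {
    publisher_ = create_publisher<std_msgs::msg::String>("/hf1/heartbeat", 10);
    timer_ = create_wall_timer(std::chrono::seconds(1), [this] {
      std_msgs::msg::String msg;
      msg.data = "alive";
      publisher_->publish(msg);
    });
  }

 private:
  rclcpp::Publisher<std_msgs::msg::String>::SharedPtr publisher_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char** argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<HeartbeatNode>());
  rclcpp::shutdown();
  return 0;
}
```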

Remote control from the iOS app

We all love autonomous robots, but teleoperation is always an ace up your sleeve. You can use it to remotely help your stuck robot, teach the robot new moves, or simply have fun driving it like an RC car.

This year, I coded a teleoperation app. For now, the app incorporates joysticks to drive the robot and look around, but stay tuned. I plan to add more fun features over time.
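Under the hood, a joystick is just a pair of numbers between -1 and 1. On the robot side, turning them into motion is conceptually as simple as the sketch below; the constants and names are illustrative, not the app’s actual mapping:

```cpp
#include <cmath>

struct Velocity { double linear, angular; };  // m/s, rad/s

// Map normalized joystick deflections (x, y in [-1, 1]) to velocity
// commands. Illustrative values, not the app's actual tuning.
Velocity JoystickToVelocity(double x, double y) {
  const double kMaxLinear = 0.6;   // m/s
  const double kMaxAngular = 2.0;  // rad/s
  const double kDeadZone = 0.05;   // ignore tiny stick noise
  auto shape = [kDeadZone](double v) {
    if (std::abs(v) < kDeadZone) return 0.0;
    return v * std::abs(v);  // quadratic curve: finer control near center
  };
  // Pushing forward drives forward; pushing left turns left.
  return {kMaxLinear * shape(y), kMaxAngular * shape(-x)};
}
```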

My main use of the app this year was thoroughly testing the robot’s dynamics. That is roboticist code for “I play with (robot) puppets”. Check out these videos shot in puppet mode: HF1 rages against the printer, and HF1 sees a specter on Halloween.

Preliminary user study

This is not really technical, but a fortunate stroke of serendipity. My son brought some friends home one day and asked me to show them the robot. What I thought would be a 5-minute, please-dad-stop session turned into a full hour of them playing with the robot. They mostly said things and watched HF1 react. It was a validating experience. After all, we might be on to something. And they are looking forward to coming again to test new features!

My son’s friends are not native English speakers. What was cool and unexpected was watching them try their best English so the robot would understand them. Maybe there’s a language-learning application waiting to be created.

Mentoring

This year, I’ve also tried to stay connected to the robotics community. Since I started Robot Fridays, I have had dozens of interesting conversations, many of them with folks entering the robotics workforce. I consider myself very lucky to have had amazing experiences in the industry, and sharing my learnings feels like a great way to give back.

Need mentoring? Book your slot now.

 

What to expect in 2025

2025 is going to be full of feature work and productization. Here’s a list of a few topics that will need attention:

  • 3D-printed parts
  • Custom interconnect PCB
  • Support for the NVIDIA Jetson Orin Nano Super Developer Kit
  • Support for Teensy 4.0+
  • Audio
  • Wider LLM support
  • Visual localization
  • Object and person detection
  • Developer APIs
  • Kit distribution
  • Online learning materials

 

How you can help

Join the waitlist

Joining the waitlist shows genuine interest in the robot. The waitlist helps me calibrate how many potential buyers there will be. I need to sell enough units to keep going. Plus, you will receive posts like this and other key project information before anybody else!

Spread the word

I love what I do, but I am self-funded and the project can only continue if there are enough people interested in buying the robot.

Please talk about HF1 with your friends and family, and send them to the project page. If they also get their robot, you’ll have robot buddies to share the fun with!

Contribute to the open-source repository

The motion control software running on the microcontroller board is open-source and is looking for collaborators. From writing unit tests to porting the code to other platforms, help is always welcome!

Talk to me

Questions? Suggestions? Feature requests? Strong opinions on code indentation? Please reply to this email. I’m always happy to get your input. Not about code indentation. That was a joke.

 
