Optimus Gains Functionality

On September 23rd, Tesla’s Optimus-focused account on the Everything App posted a video showcasing some impressive new abilities.

First off, we get a view through the Teslabot’s eyes as it calibrates its arms - moving them back and forth while the bot’s neural network maps reference points, so Optimus knows where its hands and feet are.

Then, we see it sorting some big Lego blocks by colour - which might seem small, but is probably a bigger development than the ability to stand and walk without a cable attached.

This is because the robot has reportedly not been programmed to sort these blocks - at least not in the traditional manner. It’s been given the command to sort the blocks by colour - that much is obvious - but according to the Tesla team, the bot handles the rest entirely on its own.

Some of you might remember the demo at Tesla’s AI Day last September - and then the 2023 Shareholders’ Meeting in May. During both of those events, CEO Elon Musk and Tesla’s engineers talked about using the same type of identification system that the company’s cars use for their Full Self-Driving software.

For Optimus, this means that the bot takes in visual data with its cameras, feeds it into its onboard neural net, and applies what it has learned to the task it's been asked to do. In short - the bot learns, and then remembers those lessons for later use - all without being hooked up to a larger network.
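
If you want a picture of what that kind of loop looks like in principle, here’s a deliberately toy sketch in Python. To be clear, every name in it - the Block class, the perceive/decide/act helpers, the colour bins - is something we’ve made up for illustration; it shows the general shape of an end-to-end, on-device pipeline, not Tesla’s actual software.

```python
# Illustrative toy only: a "perceive, decide, act" loop in the spirit of an
# end-to-end, on-device pipeline. None of these names come from Tesla.

import random
from dataclasses import dataclass

@dataclass
class Block:
    colour: str      # what the vision stack would report after classification
    position: tuple  # (x, y) spot on the table, arbitrary units

def perceive(scene):
    """Stand-in for the cameras + vision network: what blocks are visible right now."""
    return list(scene)

def decide(blocks):
    """Stand-in for the onboard policy: pick the next out-of-place block and its bin."""
    bins = {"red": (0, 0), "green": (1, 0), "blue": (2, 0)}
    for block in blocks:
        if block.position != bins[block.colour]:
            return block, bins[block.colour]
    return None, None

def act(block, target):
    """Stand-in for the motion controller: 'move' the block to its bin."""
    block.position = target

def sort_blocks(scene):
    # Keep looping until the decision step finds nothing left to move.
    while True:
        block, target = decide(perceive(scene))
        if block is None:
            break
        act(block, target)

scene = [Block(c, (random.randint(3, 5), random.randint(1, 4)))
         for c in ("red", "green", "blue", "red")]
sort_blocks(scene)
print([(b.colour, b.position) for b in scene])
```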

In the video, we can see researchers messing with the bot - moving blocks around while it’s trying to grab them, and tipping over blocks that have already been sorted, which the bot then sets right-side up again. You could certainly program a more conventional robot to sort blocks - but not to adapt to that kind of interference on the fly, which is the strength of this type of system.
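
That adaptability is roughly the difference between open-loop and closed-loop control: a conventionally scripted robot executes a plan computed once and never looks back, while a closed-loop system re-checks the scene before every move. Here’s another made-up toy - the meddling_researcher callback and everything else in it is our invention - showing why a closed-loop sorter shrugs off that kind of interference:

```python
# Toy model of closed-loop sorting. The "world" is just a dict of block -> location,
# and the perturbation callback stands in for the researchers in the video.

def closed_loop_sort(world, target_of, perturb=None):
    # Closed loop: re-check the world before every move, so a block that gets
    # shoved back out of place is simply seen and handled again.
    moves = 0
    while any(loc != target_of(b) for b, loc in world.items()):
        block = next(b for b, loc in world.items() if loc != target_of(b))
        world[block] = target_of(block)
        moves += 1
        if perturb:
            perturb(world, moves)
    return moves

def meddling_researcher(world, moves):
    # After the first placement, knock an already-sorted block back onto the table.
    if moves == 1:
        world["red_1"] = "table"

world = {"red_1": "table", "blue_1": "table"}
total = closed_loop_sort(world, lambda b: b.split("_")[0] + "_bin", meddling_researcher)
print(world, total)  # everything still ends up sorted; it just takes one extra move
```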

The only downside, of course, is that there needs to be some training process before a bot’s network gets this good at a task - or at least, that’s what’s being implied. We still don’t know much about the neural network beyond the idea that it works very similarly to the FSD network that guides Tesla’s vehicles.

But this little test is less mundane than it looks. Sorting things is one of the tasks Tesla hopes its robot will be able to take over for humans - making this achievement a better proof of concept than other, flashier tests.


Proving the bot can learn from past experience and not get thrown off by environmental changes means it’s ready to be trusted with this sort of task - and perhaps with more complex ones as well.


And as if to underscore that thought, the video ends with the Teslabot performing some yoga poses while balancing on one foot. It’s definitely safe to say that walking and physically manoeuvring was the easier part of building a robot like this.

Like we said, the updates for Optimus this year have been encouraging. Last September’s AI Day presentation was a little rough, but seeing the Teslabot walk, balance, and navigate workspaces while doing simple tasks does a lot to build confidence in the project.

We are very much looking forward to hearing more details about the neural network though - Tesla at least has to warn us if their bot isn’t 3-Laws Compliant.
