Craig Whitton

Sunday Story: Some More Real-World AI Applications

Updated: Jul 10

This week, we’re sharing some more real-world AI applications - largely because we’re about to launch some of our own! Stay tuned for more details, but we’re excited to bring our first one to you next week, and our second in the weeks after that. These will be AI-powered web apps that will help supercharge your leadership and prepare you and your teams for the transformation that’s ahead - and after reading this week’s Sunday Story, we have no doubt you’ll agree that transformation (or disruption!) is coming.


A few weeks ago we talked about how AI is about to really hit the mainstream - and in many cases already has - and the week before that, we talked about how AI has supercharged robotics. A few months ago, in our Disruption Series, we showed you the Figure robot doing some incredible things. This week, we’re showing you two examples of how this technology is being commercialized today. As a leader of people, you need to be aware of this technology - the folks you lead in your workplace are going to be nervous about it, your bosses will eventually expect you to implement it, and the way we navigate that conflict is going to be important. The time to prepare for that navigation is now, so you have the chance to transform your context instead of struggling through the disruption of it.


We’ve mentioned this example before, but remember a while back when self-checkouts started popping up everywhere at grocery stores and airports? Big corporations didn’t bring this technology in because they had a soft spot for the introvert-friendly grocery or airline ticketing experience. They did it because automation saved money on the bottom line. The examples below are the self-checkouts of today - there’s a huge incentive to bring this technology to market, because it will save businesses a lot of money over time and fundamentally change how work gets done. Let’s take a look.


First up, we have an update on the ChatGPT-powered Figure robot. Figure signed a deal with BMW to have one of its humanoid robots working in a BMW factory. That may not seem like a big deal, because robots have been working in car factories for a long time - as a fun fact, the first human death caused by a robot happened on a car factory floor, and some (I hope jokingly) point to that as the day the Robot Uprising began! But this is different, and I’ll tell you why.


A simple explanation is this: traditional automotive robots were anchored to one spot. They were programmed to extend the welding arm 3 feet out and 2 feet left, trigger the welding torch for 4 seconds, then move the arm 2 feet right and 3 feet back in - the exact same pattern every single time. For that to work, everything needs to be in the exact same place every single time: the thing being welded has to be firmly anchored to the assembly line in the exact same way as every other piece, and the pieces themselves have to be manufactured to the exact same thickness and shape. There’s a great deal of precision involved, and the automotive robot is essentially just repeating a highly precise task, time after time, in exactly the same way. And that particular robot can only do that one task; if it needs to be moved elsewhere in the factory, it has to be completely reprogrammed, which is a significant investment - a human has to measure the precise movements and put them into the robot’s code. Here’s the Figure robot in action:





The Figure robot is doing something remarkably different. As you’ll see in the video, it’s doing the following:


  • It’s using its robot eyes to see the objects, and then it’s figuring out how the objects should fit together based on objects it has seen before. It doesn’t matter if the objects are misaligned or in different starting positions - it’s putting things together the same way a human would: “This looks like piece A. It has to line up with piece B. So I need to move my hand, grab it, move it over here, and put it onto piece B.”

  • It’s walking and dynamically navigating its environment - meaning it’s not fixed to the ground, able only to repeat the same task over and over again. It could theoretically walk onto the back of a truck, find the right piece, walk off the truck and put it in the right place.

  • It figured out how to do all of this in simulation, just like we talked about previously - meaning not in the real world - so its learning is limited not by time but only by processing power. It can “practice” putting these pieces in the right place a million times in just a few minutes or hours, whereas a traditional automotive robot would have to practice putting pieces in place in the real world, taking real time for each piece.

  • It’s correcting its mistakes. At one point, you’ll see a piece doesn’t sit quite right. The robot knows how to move its hand and touch the piece in just such a way as to correct its placement. (We’ve sketched the contrast with the old fixed-coordinate approach just below.)
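If you’re curious what that difference looks like under the hood, here’s a very rough sketch in Python. To be clear, this is purely illustrative - the robot interface, function names and numbers below are invented for the sake of the comparison, and are not Figure’s or BMW’s actual software.

```python
# Purely illustrative sketch - the robot interface, function names and numbers
# below are invented to show the difference in approach, not real robot code.

# The old way: a fixed routine, blind to the world. Every coordinate is
# hardcoded, so every part must arrive in exactly the same position every time.
def weld_cycle(arm, torch):
    arm.move_to(out_ft=3, left_ft=2)    # extend 3 feet out, 2 feet left
    torch.fire(seconds=4)               # weld for exactly 4 seconds
    arm.move_to(out_ft=-3, left_ft=-2)  # move 2 feet right, 3 feet back in
    # ...and repeat, identically, forever.

# The new way: look, decide, act, check. Nothing about the parts' positions is
# hardcoded - the robot sees the scene, matches it against parts it has
# learned, and corrects itself if a placement doesn't sit quite right.
def assemble(robot):
    while True:
        scene = robot.camera.capture()                   # "robot eyes"
        part = robot.recognize(scene, target="piece A")  # this looks like piece A
        slot = robot.recognize(scene, target="piece B")  # it has to line up with piece B
        if part is None or slot is None:
            continue                                     # nothing to do yet - keep looking
        robot.grasp(part)
        robot.place(part, onto=slot)
        if not robot.placement_ok(part, slot):           # didn't sit quite right?
            robot.nudge(part, toward=slot)               # correct the mistake
```

The first routine breaks the moment anything is out of place; the second adapts, because every decision comes from what the camera sees rather than from hardcoded coordinates.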


Let’s take a look at another robot from a different company doing another task that used to be in the realm of humans: Picking food.





In this case, it’s a robot that picks tomatoes, so whether it’s picking fruit or vegetables is up to you. The key points here are as follows:


  • Like the Figure robot, this Hitbot robot is using its robot eyes to see the world and make decisions based on what it sees. In this case, it’s looking for only the ripe tomatoes on the vine (there’s a toy sketch of that decision just after this list).

  • It’s then able to move on its own to pick the ripe tomatoes and get them into the basket, ready to head to market. It’s ignoring the tomatoes that aren’t ripe and leaving them on the vine.
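To give a feel for the kind of decision the picker is making, here’s a toy sketch in Python. The colour thresholds and data below are invented for illustration - Hitbot’s actual vision system is far more sophisticated than a simple colour rule - but the basic loop of “look, judge ripeness, pick or leave” is the same idea.

```python
# Toy illustration only - the thresholds and data below are invented;
# a real picking robot relies on trained vision models, not a simple colour rule.

def looks_ripe(red, green):
    """Treat a tomato as ripe when red strongly dominates green."""
    return red > 150 and red > green * 1.5

def pick_ripe(tomatoes, basket):
    for tomato in tomatoes:
        r, g, _ = tomato["colour"]      # average colour the camera sees
        if looks_ripe(r, g):
            basket.append(tomato)       # pick it - off to market
        # otherwise leave it on the vine to ripen

# Two tomatoes in view: only the deep-red one gets picked.
basket = []
pick_ripe(
    [{"name": "vine-1", "colour": (200, 60, 40)},   # ripe
     {"name": "vine-2", "colour": (90, 180, 60)}],  # still green - leave it
    basket,
)
print([t["name"] for t in basket])  # ['vine-1']
```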


This robot is still at a very early stage, but it’s doing a job that has been solely the domain of human beings for thousands of years. There have been automated pickers before, but not smart automated pickers, and this application of AI to robotics is a game changer for how we procure our food.


The economic incentive here is clear: if a robot can automate a task that was traditionally the realm of a human, there’s a massive incentive for corporations to invest in that technology. Robots do not require vacations or benefits. They do not form unions to demand higher wages. They do not take sick days, nor do they get tired. We’ve had robots for a really long time, but the power of artificial intelligence is massively increasing the capabilities of these robots every single day, and this trend is going to continue as hundreds of millions of dollars are invested in commercializing this technology. We already have robots in our homes and lives - Roombas and self-checkouts - and soon we are likely to see a lot more of them every day.


Our Sunday Stories will always be written by humans though - and on that note, we’ll see you next week! Thanks for reading.
