AI & Robotics

To deliver our Series 4.0, we have applied our experience from the previous iterations and have developed a semi-autonomous driving system, allowing for new ways to interact with the world and the user.

Kiwibot started with a very simple concept:

  • Level 0: proof of concept - not a bot yet, but a prototype remotely teleoperated to make short-distance deliveries
  • Level 1: in this second iteration, the team assigned supervisors to the Kiwibots with a high level of assistance, ensuring a safer and faster operation.
  • Level 2: the Kiwibot 3.0 series was born during this iteration. Here we introduced partial automation, specifically improved perception.
    We used the Kiwibot's sensors to understand how to interpret the world, interact with the community, and find the barriers in mobility that robots face.

As you may know, in 2021 we are launching our new Kiwibot 4.0 series. To do so, we have applied the experience from previous iterations and developed a semi-autonomous driving system that opens new ways to interact with the world and the user. We are excited to introduce the world to our semi-autonomous Kiwibots and pave the way for the technological innovation of the future.

To ensure the safety of the community, we assign supervisors to monitor the urban environment and our operations. Supervisors are trained to take control of a robot if a warning or a critical event occurs. Our priorities, in order, are the safety of people, the environment, and goods.
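To make that handoff concrete, here is a minimal sketch in Python of how a monitoring loop could escalate events to a human supervisor. The severity levels, names, and fallback behavior are assumptions for illustration, not Kiwibot's actual code:

```python
from enum import Enum


class Severity(Enum):
    INFO = 0      # routine telemetry, no action needed
    WARNING = 1   # something looks off, a supervisor should watch or take over
    CRITICAL = 2  # immediate risk, the robot must not continue autonomously


def handle_event(severity: Severity, supervisor_available: bool) -> str:
    """Decide what the robot should do for a given event severity."""
    if severity is Severity.CRITICAL:
        # Critical events always stop autonomous driving and request a human.
        return "stop_and_request_supervisor"
    if severity is Severity.WARNING:
        # Warnings hand control to a supervisor if one is free,
        # otherwise the robot slows down as a conservative fallback.
        return "supervisor_takes_control" if supervisor_available else "reduce_speed"
    return "continue_autonomous"
```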

The Kiwibot 4.0 series is capable of driving under a wide range of outdoor and indoor conditions.

The new perception system, built from high-tech sensors, gives us rich information about the environment and the surroundings.

For example, the bots are able to detect traffic lights, people, vehicles, and other Kiwibots, and then make decisions in the path-planning and obstacle-avoidance process based on the extracted features, the risks, the estimated trajectories, and where these dynamic or static objects are located. In critical scenarios, such as crossing the street, we can also decide whether the robot has to change its speed limits or stop immediately.
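As a rough illustration of this kind of decision logic, the sketch below maps detections to a speed command. The labels, distances, and thresholds are hypothetical values chosen for the example, not numbers from our stack:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str         # e.g. "person", "vehicle", "kiwibot", "traffic_light_red"
    distance_m: float  # estimated distance along the planned path
    is_dynamic: bool   # moving obstacle vs. static one

NOMINAL_SPEED = 1.5  # m/s, hypothetical cruise speed
SLOW_SPEED = 0.5     # m/s, hypothetical reduced speed near moving obstacles


def plan_speed(detections: List[Detection], crossing_street: bool) -> float:
    """Pick a speed limit from the current detections; 0.0 means stop immediately."""
    if crossing_street and any(d.label == "traffic_light_red" for d in detections):
        return 0.0  # never enter the crossing on a red light
    for d in detections:
        if d.distance_m < 1.0:
            return 0.0  # obstacle directly ahead: stop
        if d.is_dynamic and d.distance_m < 4.0:
            return SLOW_SPEED  # moving obstacle nearby: slow down
    return NOMINAL_SPEED
```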

From our HQ we can activate what we call the corner-to-corner stack, which takes control of the robot and drives it from corner A to corner B automatically, without human intervention but under supervision at all times. When the robot reaches the end of the local trajectory, another algorithm takes control of its state and decides whether the robot must turn 90 degrees, cross the street, or wait for the customer.
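That arrival decision can be pictured as a small state machine. The sketch below is a simplified, hypothetical version of the choice made at each corner, using invented names:

```python
from enum import Enum, auto


class CornerAction(Enum):
    TURN_90 = auto()            # rotate to face the next sidewalk segment
    CROSS_STREET = auto()       # wait for clearance, then cross
    WAIT_FOR_CUSTOMER = auto()  # end of the route: hold position for pickup


def action_at_corner(is_final_corner: bool, next_segment_crosses_street: bool) -> CornerAction:
    """Choose what to do once the local trajectory ends at a corner."""
    if is_final_corner:
        return CornerAction.WAIT_FOR_CUSTOMER
    if next_segment_crosses_street:
        return CornerAction.CROSS_STREET
    return CornerAction.TURN_90
```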

In addition to this semi-autonomous stack, we are exploring, testing, and developing different environments for the Kiwibot. This gives the Kiwibot high-precision knowledge of where it has to go, so it can perceive the world, navigate smoothly through it without crashing or falling, and make smart decisions at all times.

With smarter software and sleeker hardware comes a better operation. If you want to learn more about the OPS in Kiwibot, click here!

To know everything we are doing at Kiwibot to change the world, click here to go to the full Keynote post.