One of the most popular depth-sensing technologies on the market today, Time-of-Flight (ToF) technology is helping Autonomous Mobile Robots (AMRs) navigate seamlessly in their environments.
What is Time-of-Flight technology?
From being just a buzzword in the world of embedded vision systems to becoming a fully-fledged sensing solution used in a wide range of electronics, Time-of-Flight technology has come a long way in the past few years. The technology started off as a research paper published by the Stanford Research Institute in the late 1970s, which theorized the use of laser sensors for 3D analysis. Technical limitations kept it from becoming a viable solution at the time, but today the technology works in tandem with autonomous AI and finds use in a variety of fields.
One of these is helping mobile robots perceive their surroundings with great accuracy and precision. ToF is finding use in industrial environments like warehouses and production facilities, where it has made warehouse management easier and more efficient.
How has ToF revolutionized Autonomous Mobile Robots?
Time-of-Flight technology is part of an artificial intelligence revolution that takes stereo vision technology a notch higher to meet modern applications. ToF systems comprise cameras that use active lighting and depth-processing units, allowing mobile robots to measure distances to their surroundings and thereby navigate better.
Time-of-Flight cameras can be mounted on top of mobile robots to extract 3D images at high frame rates with no further calibration required. Because of their active lighting system, these robots can perform tasks in well-lit conditions as well as in the dark. The depth data collected by a ToF camera is sent to a cloud-based application, which derives insights from it. These insights can then be fed back to applications and the AMRs to carry out tasks. In this way, ToF cameras are part of an AI revolution that helps mobile robots do what they were designed to do: handle tasks with little or no human supervision.
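The measurement principle behind these cameras can be sketched in a few lines: distance follows directly from the round-trip time of the emitted light. This is an illustrative simplification (real ToF sensors typically infer time from the phase shift of modulated light per pixel); the function name is hypothetical.

```python
# Core ToF principle: distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a surface given the light pulse's round-trip time."""
    return C * round_trip_time_s / 2.0

# A pulse returning after ~6.67 nanoseconds corresponds to roughly 1 metre:
print(round(tof_distance(6.67e-9), 2))  # 1.0
```

The tiny time scales involved are why ToF sensors need precise electronics: a centimetre of depth resolution corresponds to tens of picoseconds of timing accuracy.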
How does ToF help AMRs in closed environments?
ToF cameras are of great use to Autonomous Mobile Robots (AMRs) in indoor environments like industrial warehouses and facilities. These cameras perceive and analyze their surroundings with depth imaging data to help AMRs undertake business-critical functions with greater speed, ease, and accuracy.
ToF cameras help AMRs navigate closed environments through the following functions:

Localization

Localization involves the AMR identifying its surroundings and its relative position within them. Traditionally this would require a GPS signal, which is often unavailable indoors. Instead, AMRs localize themselves in a particular indoor environment by using ToF cameras to capture 3D depth data and matching it against the known environment to calculate their exact position.
Mapping & Navigation
Using ToF cameras to obtain 3D depth data and running SLAM algorithms on it, AMRs can create an accurate map of their environment. The mapping helps AMRs create efficient predetermined paths within the premises and navigate them with ease.
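A common building block of such maps is an occupancy grid: each depth reading marks the cell where the light pulse hit as occupied. The sketch below assumes the robot's pose is already known (a full SLAM system estimates the pose and the map simultaneously); all names here are illustrative.

```python
import numpy as np

def update_grid(grid, pose_xy, angles, depths, cell_size=0.1):
    """Mark grid cells hit by depth readings taken from a known pose.

    angles are beam directions in radians, depths in metres.
    """
    x0, y0 = pose_xy
    for theta, d in zip(angles, depths):
        # Project the depth reading into world coordinates.
        hx = x0 + d * np.cos(theta)
        hy = y0 + d * np.sin(theta)
        i, j = int(hx / cell_size), int(hy / cell_size)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] = 1  # obstacle observed in this cell
    return grid

grid = np.zeros((50, 50), dtype=np.uint8)  # 5 m x 5 m map at 10 cm resolution
update_grid(grid, pose_xy=(2.5, 2.5), angles=[0.0, np.pi / 2], depths=[1.0, 1.5])
```

Path planners can then search this grid for collision-free routes, which is how the "efficient predetermined paths" above are typically computed.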
Obstacle Detection & Avoidance
In addition to supporting predetermined paths, ToF cameras scan the AMR's surroundings in real time, identifying objects or obstacles and helping the AMR navigate around them.
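At its simplest, real-time obstacle detection on a ToF depth frame is a proximity check: stop if anything in the frame is closer than a safety threshold. This is a minimal sketch with a hypothetical function name; production systems would also segment the ground plane and filter sensor noise.

```python
import numpy as np

def obstacle_ahead(depth_frame: np.ndarray, stop_distance_m: float = 0.5) -> bool:
    """Return True if any valid depth pixel is closer than the threshold."""
    valid = depth_frame[depth_frame > 0]  # a 0 reading often means "no return"
    return bool(valid.size) and float(valid.min()) < stop_distance_m

frame = np.full((4, 4), 2.0)   # everything 2 m away: path is clear
print(obstacle_ahead(frame))   # False
frame[1, 2] = 0.3              # an object appears 30 cm ahead
print(obstacle_ahead(frame))   # True
```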
ToF cameras also work in tandem with the other sensors in an AMR, like gyroscopes and wheel encoders, to provide the robot's position in real time with greater accuracy.
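One simple way such sensors are combined is a complementary filter: fast but drift-prone readings (gyroscope-integrated heading) are blended with slower, drift-free fixes (heading from camera-based localization). This is an illustrative sketch, not a specific vendor's fusion algorithm; names and the blend weight are assumptions.

```python
def fuse_heading(gyro_heading: float, camera_heading: float,
                 alpha: float = 0.98) -> float:
    """Blend a drift-prone gyro heading with a drift-free camera heading.

    alpha close to 1 trusts the fast gyro short-term while the camera
    slowly corrects accumulated drift.
    """
    return alpha * gyro_heading + (1 - alpha) * camera_heading

# Gyro has drifted to 1.02 rad; camera localization says 1.00 rad.
fused = fuse_heading(gyro_heading=1.02, camera_heading=1.00)
```

In practice a Kalman filter plays this role, but the idea is the same: each sensor covers the other's weakness.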
What is the role of ToF powered AMRs in automated warehouse operations?
Time-of-Flight (ToF) cameras have revolutionized traditional warehouse operations by equipping Automated Guided Vehicles (AGVs) and Autonomous Mobile Robots (AMRs) with advanced autonomous navigation capabilities. ToF cameras help these vehicles and robots perceive their surroundings and undertake business-critical functions with greater accuracy and speed.
Implementing AMRs with ToF may significantly increase productivity and efficiency, while lowering labour costs and raising customer satisfaction, depending on the needs and current capabilities of an organisation. Embedded vision systems will pave the way for an increasing number of cutting-edge navigational innovations in the future as autonomy and technology both continue to advance, and prices fall.
By Mr. Maharajan Veerabahu, Co-Founder & Vice President of e-con Systems