“What we strive for is to see companies rapidly build and program their robotics platforms, allowing greater time to focus on developing their unique intellectual property while still meeting market demand.”
Please tell us about your role at Cogniteam and how you arrived here.
I am the co-founder and CEO of Cogniteam, which was founded back in 2010. At the time there was a need for enterprises to develop robots, which would allow them to address specific market needs.
In the last 12 years, the space has developed at an exciting pace, now including enterprises, startups, and even at-home tinkerers. What was once an underground DIY community of robot builders has exploded, powered by greater online resources and by automation software applied to today’s robotics, such as our drag n’ drop Nimbus operating platform.
Prior to co-founding Cogniteam, I earned a Ph.D. in computer science with a concentration in Robotics & AI. I am also the former dean of Computer Science at one of Israel’s largest institutions of higher education, the College of Management and Academic Studies (COMAS).
Please tell us about your recent collaboration with AAEON and how this partnership will transform modern robotics.
Working together with AAEON is truly a meeting of the minds. We have a similar vision of streamlining robotics development and slashing the time needed for companies to deploy their latest products.
AAEON has pre-integrated the Nimbus robotics operating system into its hardware. In an exciting developmental leap, engineers can take an OEM’s off-the-shelf computer module and create a commercial product to deploy, manage, and monitor. This rapid pace is achievable without worrying about device interconnectivity or large-scale cloud connectivity.
Once Nimbus is enabled, developers can access a massive library of field-tested software and AI algorithms to add to their robots, including Nvidia Jetson ISAAC SDKs. This gives teams the ability to skip the 5-6 years it typically takes to develop a base hardware and software machine; going from ideation to deployment can take as little as 18 months, as engineers go straight to developing their proprietary designs. It also gives robots cloud connectivity and the benefit of over-the-air updates.
Thinking of every stage of the lifecycle, developers can run advanced simulations before the robot is ever deployed. Within these simulations, which are used to test how a robot will operate in the field, Nimbus can suggest the ideal AAEON hardware to use. It can also suggest whether to reduce power consumption or to increase capabilities to meet the developer’s requirements.
What are the major challenges in the robotics development techniques that impact the outcomes of modern integrations?
Great question! I can’t say it enough: modern robotics is all about integrations.
From the hardware perspective, companies and engineers are used to purchasing components from third-party vendors. These can include sensors, GPS, and a whole lot more.
On the software side, we’ve seen many companies prefer to redevelop technologies themselves, even starting from scratch on capabilities that have been widely available for some time. The result is a costly development process, lengthy testing, and difficulties in deployment.
Nimbus is all about integration, providing teams with the whole infrastructure from development through deployment and management, along with ready-made, integration-ready algorithms.
How do you help robotics teams overcome these developmental challenges at Cogniteam?
What we strive for is to see companies rapidly build and program their robotics platforms, allowing greater time to focus on developing their unique intellectual property while still meeting market demand.
Helping robotics teams means ensuring that the latest capabilities are available to them, including the latest hardware. For example, we had early access to the latest Nvidia Jetson AGX Orin system, which allowed us to test, integrate, and conduct a controlled deployment using this fascinating hardware.
The same goes for localization, spatial recognition, and other software that today’s robots demand as a baseline for development.
Could you please tell us more about Cogniteam’s Nimbus and its relationship with Nvidia’s Jetson package?
We are very honored to be a trusted Nvidia partner, bringing greater capabilities to their already impressive technologies. This granted us early access to the Jetson AGX Orin, allowing us time to ensure that it would sync with Nimbus, our robotic operating system.
This drag n’ drop system means companies can easily test components which were specifically designed to work harmoniously with the Nvidia Jetson package, including their latest release of the Jetson AGX Orin.
Through Nimbus, developers can rapidly integrate Nvidia ISAAC SDKs to boost the functionality of the robot in real time through localization, object detection, feature tracking, and much more. It also gives these robots cloud connectivity to learn new capabilities or patch software without the robot needing to return to a maintenance facility.
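To give a rough sense of the kind of perception capability being described, here is a minimal, purely illustrative object detection sketch using an off-the-shelf pretrained model. It is not Nimbus’ or Nvidia’s actual API; the file name and confidence threshold are assumptions for the example.

```python
# Illustrative only: generic object detection with a pretrained torchvision model,
# standing in for the kind of perception module a robot might run on a Jetson.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = Image.open("camera_frame.jpg").convert("RGB")  # hypothetical camera frame
with torch.no_grad():
    detections = model([to_tensor(frame)])[0]

# Report only confident detections (threshold chosen arbitrarily for the example)
for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if score > 0.5:
        print(f"class {label.item()} at {box.tolist()} (score {score.item():.2f})")
```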
Please tell us what kind of product management techniques you use at Cogniteam to deliver continuous improvement in a simulated environment. Do you use any AI or machine learning algorithms to expand on your core offerings?
While we work with some of the world’s largest enterprises, we also recognize the need to remain nimble and functional, serving the needs of both corporate and startup engineers.
One way that we have achieved this is by allowing the simulation to run on a client’s browser without the need for continuous cloud connectivity. The user’s computer downloads the environment and runs the simulation locally, reducing the computing demand and lowering the cost of cloud computing.
The simulation is also ‘smart’, allowing users to configure software with the exact hardware that the robot will be using in the field. This insight is key for engineers who want to know how their actuators, cameras, sensors, etc. will react in various scenarios.
A simple camera is a good example. Nimbus’ simulation takes the scene, hardware, and software into account, so the simulated robot may return an image that is slightly blurry. Teams can then work out whether the problem is the result of out-of-date drivers or one of many other factors. The rich environment can also layer in people, objects, traffic lights, moving objects, and even other robots to better understand how a robot will act in a real-world environment.
All this is available on a standard laptop, without the need for a GPU.
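To make the camera example above concrete, here is a small, purely illustrative sketch of how a simulator can degrade a clean rendered frame the way a mis-configured real camera might. The blur and noise parameters are invented for the example and do not reflect Nimbus’ actual simulation model.

```python
import cv2
import numpy as np

def simulate_camera(frame: np.ndarray, blur_sigma: float = 1.5, noise_std: float = 4.0) -> np.ndarray:
    """Degrade a clean rendered frame to mimic focus blur and sensor noise."""
    blurred = cv2.GaussianBlur(frame, (0, 0), blur_sigma)  # optics / driver blur
    noisy = blurred.astype(np.float32) + np.random.normal(0.0, noise_std, frame.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

clean = np.full((480, 640, 3), 128, dtype=np.uint8)  # stand-in for a rendered frame
degraded = simulate_camera(clean)                    # what the simulated robot "sees"
```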
What is the future of ethical robotics?
To expect a robot to act either ethically or unethically, it must first understand the normal behavioral patterns for the social environment it is in.
Leading the charge in Human-Robot Interaction (HRI), which will allow robots to understand how to interact with and around humans, Cogniteam is an integral part of Israel’s $17M HRI consortium. This collective of professional and academic organizations will help companies overcome significant roadblocks when building fully autonomous robots.
A simple way of thinking about how HRI will work is to give a robot the ability to understand the body language of the humans around it. For example, if a larger robot is moving down a hallway and people are moving around unfazed, it can continue on. If it sees that people’s movements are hesitant, uncomfortable, or avoidant, it can take corrective action. That may mean slowing down, moving aside, or something else that comes across as less intimidating to the humans around it.
Once finalized, these HRI capabilities will be available on Nimbus for instant integration with new and existing AI-driven robots.
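As a toy illustration, the hallway behavior described above can be thought of as a simple decision rule over a perceived “hesitancy” score. The structure below is an assumption made purely for explanation; it is not the consortium’s actual models.

```python
from dataclasses import dataclass

@dataclass
class PersonObservation:
    distance_m: float   # how far the person is from the robot
    hesitancy: float    # 0.0 = walking past unfazed, 1.0 = frozen or avoidant

def choose_speed(base_speed: float, people: list[PersonObservation]) -> float:
    """Slow down in proportion to how uncomfortable nearby people appear."""
    nearby = [p.hesitancy for p in people if p.distance_m < 3.0]
    if not nearby:
        return base_speed                          # nobody close: keep cruising
    return base_speed * (1.0 - 0.8 * max(nearby))  # up to 80% slower when people hesitate

# Example: one relaxed person far away, one hesitant person close by
print(choose_speed(1.0, [PersonObservation(6.0, 0.1), PersonObservation(2.0, 0.9)]))
```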
Your advice to young robotics engineers looking to adapt to ever-expanding data science technology innovations across various fields, particularly in cloud and software development:
My response is always the same. Get out there and get your hands dirty!
There is so much to read and learn but ultimately, there is nothing like the joy and adventure in seeing pieces of hardware come to life. I’ve been doing it for decades and it still excites me.
For young engineers, I recommend getting a simple R&D platform with a few sensors. Learn how they work and ask yourself: what technological and practical approaches did the engineers take to bring this concept into reality? How did they get it to work?
Next, learn the main building block algorithms that enable autonomy. Examples of these are localization, mapping (SLAM), navigation, object detection (using neural networks), and anything else that piques your interest to keep you going.
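For readers who want a feel for what one of these building blocks looks like, here is a minimal, self-contained 1D histogram (Bayes) filter for localization. It is a textbook toy with an invented landmark map, not a production SLAM stack.

```python
# Toy 1D localization: a robot on a ring of cells refines its position belief
# from noisy landmark readings (sense) and known motion (move).
WORLD = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]   # 1 = a landmark is visible in that cell
P_HIT, P_MISS = 0.8, 0.2                 # simple sensor model

def sense(belief, measurement):
    """Bayes update: up-weight cells whose landmark value matches the reading."""
    weighted = [b * (P_HIT if WORLD[i] == measurement else P_MISS)
                for i, b in enumerate(belief)]
    total = sum(weighted)
    return [w / total for w in weighted]

def move(belief, step):
    """Exact motion on a ring: shift the belief by `step` cells."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

belief = [1.0 / len(WORLD)] * len(WORLD)        # start completely uncertain
for measurement, step in [(1, 1), (0, 1), (0, 1), (1, 1)]:
    belief = move(sense(belief, measurement), step)

print(max(range(len(belief)), key=belief.__getitem__))  # most likely cell index
```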
A technology event/conference/webinar/podcast that you would like to attend in 2022-2023:
- ICRA 2023: https://www.icra2023.org/
- Automate USA (Detroit): https://www.automateshow.com/travel
- Robotics Summit & Expo: https://www.roboticssummit.com/
Thank you, Yehuda! That was fun and we hope to see you back on itechnologyseries.com soon.
[To participate in our interview series, please write to us at sghosh@martechseries.com]
Dr. Yehuda Elmaliah is a leader in the field of robotics development and human-robot interaction (HRI), with a Ph.D. in computer science concentrating in Robotics & AI. Prior to founding the robotics development, deployment, and management company Cogniteam, Dr. Elmaliah was the Dean of Computer Science at one of Israel’s largest institutions of higher education, the College of Management and Academic Studies (COMAS).
Cogniteam has been developing artificial intelligence technologies for robots for over ten years, working with blue-chip companies on mapping, navigation, and autonomous decision-making. They have now packaged their suite of algorithms together as a single product, Nimbus, bringing developers a unique cloud-based robotic artificial intelligence solution in one easy-to-use place.