Team AVATAR-HUBO uses cutting-edge technology to create a human-centric, functional robot that provides an immersive experience from a remote location, supporting multi-modal interaction in scenarios of real-world utility. Our team is motivated to create solutions for remote workers in the “future of work”: an idealized future in which many of the problems facing the workforce (such as jobs inaccessible to people with disabilities, dangerous workplaces, and the limitation of location) are eased with the help of robotic telepresence. Imagine a world in which the 13% of the US population who cannot work due to physical disability were granted the ability to operate robots to complete their jobs. Imagine a world in which human lives need not be risked to accomplish complex, important tasks (such as those found in disaster response). Imagine a world in which the best surgical oncologist in America could save lives on all seven continents in a single day. The possibilities are limitless once the main limiters of human ability (namely health, safety, and presence) are removed.
Team AVATAR-HUBO is striving to trailblaze the “future of work” by building a humanoid robot that uses haptic feedback, VR/AR technology, and advanced algorithms to allow anyone to inhabit the body of a functional robot without losing touch with the innately human side of working. By donning a VR headset, motion-capture gear, and a haptic suit, one gains the ability to control the robot’s body, see what the robot sees, and feel what the robot feels. In a sense, one can transcend one’s own body and control another, all in a streamlined and intuitive way, making it possible to experience and manipulate these newfound surroundings in a surprisingly natural manner.
Team AVATAR-HUBO is a diverse team with a wide variety of experience across several skill sets. The team has extensive experience tackling perplexing problems in robotics research and is led by one of the industry leaders in humanoid robotics research, Dr. Paul Oh. Some members of the lab were even involved in the 2015 DARPA Robotics Challenge (the “DRC”), where the DRC-HUBO@UNLV team placed 8th using the same humanoid robot that Team AVATAR-HUBO has repurposed for the ANA Avatar XPrize. Since the DRC, the team has developed skills in machine learning, haptics, and aerial manipulation.
Drawing on a wide range of cultures and experiences, Team AVATAR-HUBO brings a plethora of viewpoints and approaches to developing the technology, leading to more innovation. Members of the team are experts in VR/AR, humanoid robotics, computer vision, robotic controls, and many other areas; in fact, the team has extensive experience working with platforms such as “Spot Mini” from Boston Dynamics, “DRC-HUBO” from Rainbow Robotics, “ROBOTIS-OP” from ROBOTIS, the OptiTrack motion-capture system, and many more. Additionally, many team members have spent time researching at the Naval Research Laboratory, the Korea Advanced Institute of Science and Technology (KAIST), and NASA’s Jet Propulsion Laboratory. This extensive and diverse combination of experiences, backgrounds, and opinions makes Team AVATAR-HUBO not only competitive but a driving force in the budding industry of robotic telepresence.
The technology developed by the team will not only push the boundaries of robotics research but revolutionize the future of robotic avatars, and quite possibly the future of work as well. The platform will make lives easier by connecting people around the world; a person in Las Vegas can immediately be present in Miami simply by setting up the equipment. Furthermore, this technology will propel society toward the “future of work”, providing self-efficacy and jobs to millions of disabled individuals across the country. Dangerous work environments (such as forest fires) will see fewer work-related injuries, too; for example, trained firefighters can pair their intuition and experience with the precision of robots to fight fires more efficiently while remaining out of harm’s way. Additionally, experts in a field (e.g. expert firefighters) will all be able to respond to a situation at a moment’s notice, regardless of where they are on Earth.
The team is working to optimize a 3D visual and auditory sensing system that mirrors natural human perception. The operator currently has a first-person (and soon-to-be third-person) point of view from the robotic avatar, granting better situational awareness. These two technologies, integrated harmoniously, raise the standard for robotic teleoperation by providing a platform usable in myriad applications across countless industries. Moreover, engineering low-latency frameworks for control and computer vision aids in remotely controlling the robot, greatly furthering its potential in fields such as disaster response.
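A low-latency teleoperation framework is usually validated by measuring the command/acknowledgement round trip between the operator station and the robot. The sketch below is illustrative only: `send_command` and `wait_for_ack` are placeholder callables standing in for whatever transport the team actually uses, not part of any real API.

```python
import time

def round_trip_latency(send_command, wait_for_ack, trials=100):
    """Measure command/ack round-trip latency of a teleoperation link.

    `send_command` and `wait_for_ack` are placeholders for the
    transport layer (e.g. a publisher/subscriber pair); they are
    hypothetical names for this sketch.
    """
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        send_command()   # operator -> robot
        wait_for_ack()   # robot -> operator
        samples.append(time.perf_counter() - t0)
    samples.sort()
    # Report median and 95th-percentile latency in milliseconds.
    return {
        "median_ms": 1000 * samples[len(samples) // 2],
        "p95_ms": 1000 * samples[int(len(samples) * 0.95)],
    }
```

Tracking the 95th percentile rather than only the average matters for teleoperation, since occasional latency spikes are what break the operator's sense of presence.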
The robotic avatar used for the competition will be a repurposed DRC-HUBO, a humanoid robot designed and built by the team's partner lab in South Korea, the KAIST Hubo Lab, in conjunction with research performed at the team’s institution, the Drones and Autonomous Systems Lab (DASL). The robot has 32 degrees of freedom, giving it ample flexibility to perform many actions. During the 2015 DARPA Robotics Challenge Finals, the humanoid handled a power drill, drove a Polaris ATV, opened doors, turned valves, handled power connectors, walked on uneven terrain, climbed stairs, and cleared rubble. Because humanoid robots resemble humans, they can perform nearly any action a human can, as long as algorithms and methods can be developed to replicate human performance.
The team also has research experience testing DRC-HUBO in scenarios such as dynamic lift-and-carry of different materials, coordination with human coworkers, and VR teleoperation for disaster relief. For the ANA Avatar XPrize competition, the humanoid will be able to move omnidirectionally in walking mode, allowing the avatar to navigate around complex objects and maintain a dynamic footprint as humans do. Additionally, the robot can move using a differential drive, with wheels driven by motors mounted at its knees. This lets the robot devote computational resources to balance only when necessary, such as when navigating uneven terrain or climbing stairs, while simply wheeling around on even surfaces for ease of use. DRC-HUBO is also equipped with over-actuated arms that allow it to manipulate a wide variety of objects based on end-effector position and orientation. The new hand designed for DRC-HUBO will have force-sensing fingers, allowing real-time force reflection to the user during operation of the robotic avatar.
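The wheeled mode described above follows standard differential-drive kinematics: average wheel speed gives forward motion, and the speed difference across the wheel base gives rotation. A minimal sketch (parameter names and values are illustrative, not taken from the robot's actual specification):

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """One Euler-integration step of a differential-drive base.

    v_left/v_right are wheel rim speeds in m/s; wheel_base is the
    lateral distance between the two knee-mounted wheels (an
    assumed parameter for this sketch).
    """
    v = (v_left + v_right) / 2.0             # forward speed
    omega = (v_right - v_left) / wheel_base  # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Equal wheel speeds drive the robot straight ahead; opposite speeds rotate it in place, which is what allows the avatar to reposition precisely in tight spaces without engaging the balance controller.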
The team envisions the avatar being used as a remote worker. For example, the humanoid robot could enter a disaster scene like the Fukushima nuclear disaster while being controlled by someone in another country from a safe, monitored location. In addition, the operator will have full situational awareness of the robot regardless of where it is located, using VR/AR technologies in conjunction with 3D scene reconstruction. Although it remains a work in progress, it is also the team’s goal to have the robot controlled by an EEG/BCI headset. In this scenario, the platform could be used by both able-bodied and non-able-bodied people, opening new horizons for those with disabilities to control robots across the world from any location in real time. Combining these control frameworks, sensor modalities, and robotic platforms will allow the creation of a robotic avatar that operates remotely in real time while providing an immersive experience and efficient control for operators of varying skill levels across a wide variety of applications and services.
Team AVATAR-HUBO has access to state-of-the-art facilities in a wide variety of areas including humanoid robotics, virtual reality, unmanned autonomous vehicles, and service robotics. Our team works in the Drones and Autonomous Systems Lab (DASL) at the University of Nevada, Las Vegas (UNLV). DASL is housed within a 10,000 sq. ft. building near the main campus of UNLV and thus has ample space to test the avatar in a wide variety of scenarios. DASL's facilities include:
2 motion capture systems (1 large for drone testing, and 1 small for humanoid robots)
VR space with associated equipment (e.g. VR headset and controllers)
Humanoid robot testing area with 1 large motorized gantry, 3 mobile gantry units, and a variety of testing scenarios (valve turning, metal staircase, door opening)
Woodshop with laser cutter, screenprinter, and woodworking tools
Machine shop with enclosed CNC mill, PCB manufacturing jig, and metalworking/welding tools
Electronics shop with multiple 3D printers, soldering equipment, and electronic components
Augmented reality & projection mapping room with controllable light levels
1 functional full-sized humanoid robot (DRC-Hubo) & 1 social full-sized humanoid robot (Jaemi Hubo)
3 miniature humanoid robots (ROBOTIS-OP1/2), 3 co-bot arms (1 HDT & 2 RB5), 1 service robot (Furo), 1 multimedia robot (Jay)
A large stock of various UAV types and sizes and associated control, service, and development equipment