Agriculture is transforming at an unprecedented pace. Roboticists are working to automate farm processes and, year after year, building machines for harvesting fruits and vegetables. A farm in New Zealand plans to deploy a robot that picks ripe apples from trees, one more sign that machines will be helping us grow crops in the future.

Your apples will soon be picked only by robots

Anna Samoydyuk

The robot, developed by Abundant Robotics, navigates the rows between apple trees using lidar (light detection and ranging) and searches for fruit using machine vision.

“The robot recognizes apples in real time. If the fruit is ripe, the computer system tells the machine to pick it,” explains Dan Steere, CEO of Abundant. Of course, the robot doesn't so much tear the fruit off as swallow it: the arm uses a vacuum tube that sucks the apple from the tree. The apple then travels along a conveyor belt and drops into a bin. The robot can do this around the clock.

There are many logistical and technical reasons why such a robot did not appear sooner. When it comes to agricultural automation, the tools have historically been more machete than scissors. Harvesters are widely used on farms to bring in wheat or cotton, but apple trees are trees, and you can't just drive a combine over them to pick the fruit. “Neither the tree nor the fruit can be damaged. It requires a much more complex process,” explains Steere.

Automation of apple picking relies largely on sensing: the robot not only identifies the fruit but also assesses its ripeness. After consulting with the farmer, the operator can configure the system so that the robot looks for a specific color that signals a ripe apple.
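
As a rough illustration of such a color cue, a ripeness check might look like the sketch below. It assumes the vision system already hands over the average color of one detected apple; the threshold is hypothetical and would be tuned per orchard and variety together with the farmer.

```python
# A minimal sketch of a color-based ripeness check, assuming the vision
# system already provides the average color of one detected apple as an
# (R, G, B) triple. The threshold is hypothetical and would be tuned per
# orchard and variety together with the farmer.

RIPE_RED_RATIO = 1.4   # hypothetical: how strongly red must dominate green

def looks_ripe(avg_rgb):
    """Return True if the fruit's average color suggests it is ripe."""
    r, g, b = avg_rgb
    return g > 0 and r / g >= RIPE_RED_RATIO

print(looks_ripe((180, 90, 60)))   # -> True: red clearly dominates
print(looks_ripe((110, 140, 70)))  # -> False: still mostly green
```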

You might think that the end of human farming is near. Before you start sounding the alarm about robots taking our jobs, it's worth remembering that automation is nothing new, especially in agriculture. Think about what happened to wheat. Before the advent of combine harvesters, thousands of workers worked entire fields by hand. So, it's no surprise that apples and other crops will soon see automation too.

Thanks to the robot, people will be freed from physically hard work. Instead, they can either supervise the robot as it moves around the orchard or pick up the fruit it misses. This matters for agriculture because the industry is experiencing a severe labor shortage; automation is simply necessary if we are to feed all of humanity.

What's also interesting is that crops can now be adapted to machines. Apple trees in New Zealand are not like the ones growing at your country house: while ordinary trees are bulky and rounded, these are trained flat and look more like grapevines. This shape has many advantages: besides making the fruit easier for both humans and robots to reach, it lets more sunlight fall on the apples. So we must adapt not only the machines to the crop, but also the crop to the machines.

Yes, to some extent agricultural robots will learn to adapt to any environment. But we definitely won't be able to build one universal fruit-harvesting machine; crops are simply too diverse. On the other hand, robots will someday have abilities people lack, such as sheer speed. Ultimately, they will help us maintain a sustainable food production system on a changing planet.

A person spends a significant part of his time on monotonous household chores such as cleaning a room or working in the garden. Some people genuinely enjoy this kind of activity, but for most, keeping their living space in order is a routine, boring and not very pleasant task. Since the 1950s and 1960s, when the concept of a “robotic assistant” was just beginning to emerge, society has dreamed of shifting part of its daily duties onto a soulless mechanized device, immune to fatigue and stress and ready to do the dirtiest work. We are talking about robot servants and automated assistants, the prototypes of which appeared more than half a century ago.

The first mobile robot that could reason about commands and its own actions

In 1966, engineers at the Artificial Intelligence Center of the Stanford Research Institute set out to create a robot able to navigate and move around indoors on its own without causing accidents. The project included developing a self-learning design on a wheeled chassis, along with holistic analysis of the tasks assigned to the machine.

The device, called Shakey, was equipped with a set of sensors and a television camera for determining its current location and the dimensions of surrounding objects. The Shakey project was completed in 1972, embodying the most advanced engineering achievements of its time in a single design. The mobile device demonstrated its capabilities in a special test pavilion consisting of several rooms connected by corridors: the robot followed the scientists' commands, pushing various objects around, opening and closing doors, and operating switches.

The promise of the algorithms embedded in Shakey prompted scientists to continue working in this direction, creating a number of more advanced automated machines and eventually giving devices of this type the ability to recognize and respond to voice commands.

Cordless and autonomous lawn mowing

In 1969, MowBot Inc. introduced the world to a robotic lawn mower that ran on a built-in battery, with no need to plug into a household outlet. The battery charge was enough to cut the grass on an area of 650 m². And although the $795 device was a long way from today's programmable “smart” mowers that can be controlled from a smartphone, the idea of getting rid of wires proved compelling and was developed further.

Full-size robot Arok: both walks the dog and takes out the trash

What “house of the future” could do without robotic servants? Just such a thought occurred to inventor Ben Skora, who presented his vision of a home that was futuristic by 1970s standards, with remote-controlled lamps and other technical novelties. It did not lack “smart” staff either: that role went to Arok, a two-meter robot with a frankly creepy face.

The mechanized giant's duties included taking out the garbage, serving drinks and even walking the family's four-legged pet. Of course, an operator to control the device was a must, so the household staff in the “house of the future” gained an extra position: supervisor of the robot assistant.

Omnibot, a popular toy robot in Japan: the back story

Readers of 3DNews are well acquainted with a device called Omnibot, but much less is known about its progenitor, one of the most compact robots of its era, the Omnibot 2000. This unusual device was released in 1984 and was, for its day, a highly advanced autonomous model on the market of unusual toys.

Omnibot 2000 could be remotely controlled, but the developers also provided for fully independent movement of their creation along a predetermined route. All the data needed for the programmed movement was recorded on a tape, and the robot could serve as a waiter, delivering food and drinks at a large party.

SynPet Newton: domesticated version of the “star” R2D2

If you liked R2D2, the cute, beeping robot from George Lucas' Star Wars saga, you will be interested to know that between the late '80s and early '90s a commercial analogue of it was on sale: the SynPet Newton. Of course, this robot, roughly 86 cm tall, cannot be called an exact copy of the legendary R2D2, but the resemblance in design is, as they say, obvious.

SynPet Newton could move freely around the apartment, boasted voice control and helped with household chores. A 16-bit microprocessor was responsible for its performance, together with a wide range of sensors for fully autonomous movement in the selected mode. SynPet Newton could also talk to residents through a voice synthesizer and connect its owner to the outside world via a built-in cordless phone and a modem.

True, only the wealthiest Americans could afford a SynPet Newton: the price of the “smart machine” was a fabulous $8,000.

The pinnacle of humanoid robot evolution from Honda's engineers

Perhaps the most famous humanoid robot today is Honda's ASIMO. It took the Japanese company's engineers about ten years to bring the prototype to its current level: a combination of high movement speed, remarkable agility and advanced interaction with people.

ASIMO can greet guests with a friendly handshake and serve drinks as well as any real waiter.

iRobot Roomba: responsible for the cleanliness of your home

Robot vacuum cleaners have not yet become a common gadget in ordinary homes because of their high cost. Some models nevertheless enjoyed commercial success and settled into their owners' apartments, among them one of the first mechanized home cleaners, the iRobot Roomba. The main task of the device, which appeared on the market 12 years ago, is high-quality and, most importantly, fully autonomous cleaning of even the most difficult floor coverings.

Humanoid robot Reem: both a loader and an information center

Have you ever had to cross a station or airport with bulky, heavy luggage while trying to find the information you need to board a flight? Apparently this very problem in Spain, where PAL Robotics is based, prompted a team of four engineers to develop the Reem-A carrier robot.

The developers already had experience building humanoid machines for service roles, which made it possible in 2012 to introduce a commercial Reem model with a remote-control function, capable not only of carrying loads but also of acting as an information and reference kiosk.

Subsequently, the device was upgraded to the REEM-C version, which regained the two legs provided for in the “A” and “B” modifications.

Your personal robotic bartender for $2700

Setting aside tasks that require moving through space, lifting loads or complex mechanical manipulation, what could a small, stationary robotic device be good for? Mixing cocktails, of course. The Monsieur robot is an example of a skillful automated bartender that will not only prepare your favorite drink but also cheerfully greet its owner on returning home. To that end, the designers included a mobile app that detects when you are in the apartment, syncs with Monsieur and controls the machine via Bluetooth and Wi-Fi.

The system is capable of not only fulfilling orders for cocktails remotely from a smartphone or tablet, but also offering you double portions of drinks if you are late at work and have had a very busy day.

The main feature of the 23-kg box with a touch screen is the number of cocktails it can prepare for guests at your party. The device includes 12 thematic variations - “non-alcoholic party”, “sports bar”, “Irish pub” and others, each of which has about 25 recipes for various drinks.

The robotic bartender project became possible thanks to the Kickstarter crowdfunding platform, where the Monsieur startup raised a total of $140,000.

Startup JIBO: if you are lonely and have no one to talk to

The JIBO robot, a favorite of visitors to the Indiegogo site who brought its creators over $2 million, promises to be a personal, sympathetic interlocutor: a polite, patient and encouraging listener, whatever your current emotional state.

The so-called social behavior model characteristic of JIBO, combined with advanced hardware and software components, will allow the device to find an individual approach when communicating with each family member. The device is able to independently identify the interlocutor, as well as capture his mood in order to choose the most appropriate behavior algorithm in the current situation.

With wireless access to the Internet, JIBO will find recipes for the upcoming dinner on voice request, tell you about new e-mail, help with shopping, crack an appropriate joke, entertain you with a funny story, and brighten a gloomy evening with a good piece of music.

Almost anyone can get an unusual robotic friend, because the price for JIBO is only $500.

Robots on guard

An excellent use for robotic devices is security. Indeed, thermal imagers, motion sensors, laser rangefinders, all kinds of cameras and “smart” systems should, in theory, be able to detect an intruder, sense that something is wrong and report a threat or an intrusion into a protected area far earlier than even an experienced human guard.

While the Knightscope machine is intended for passive observation and sending an alarm to a control console, the PatrolBot Mark II security robot, for example, is ready to counter an intruder on its own. To that end, its wheeled platform carries a 100 dB siren and a water cannon, with which the operator can quite literally ruin an offender's reputation and clothes.


Robots have become a firm part of modern life. They tirelessly help people in factories and hospitals, perform complex calculations and require no wages. The time when machines start helping us with housework or teaching our children has already arrived; not everyone knows it yet.

1. “Social” robot “Jibo”

"Jibo" is a cute little "social" robot with a personality that can easily become best friend families. A robot with a witty sense of humor has an unusual appearance and fun animations that spin and dance. "Jibo" is not just a toy, but a high-tech development. Using technology artificial intelligence, cameras and microphones, the robot is capable of studying the voices, emotions and faces of sixteen different people.


“Jibo” can perform a range of useful tasks, from setting alarms, taking photos and reporting the weather to simply talking with a person, choosing its intonation and phrasing to suit the interlocutor. "Jibo" is already on sale for $900.

2. Robot genius “Professor Einstein”

The robot is designed as a caricature version of Albert Einstein and is positioned as an educational tool for people of all ages. In addition to a wide knowledge base, the robot also has a good sense of humor, which makes the learning process more fun. With Einstein you can not only learn the exact sciences, but also play various educational games.


The robot uses a cloud database to answer all sorts of questions. Beyond that, “Professor Einstein” can hold a dialogue with its interlocutor and has fifty different facial expressions. The robot costs $200.

3. Robot assistant “Aeolus”

Imagine a robot on wheels that, on command, brings a bottle of soda from the refrigerator: for anyone waiting for the ideal household helper, a dream come true. According to its website, the Aeolus robot assistant is designed to make your life easier by doing the housework while you sleep. The robot uses artificial intelligence and machine learning to recognize objects from different angles and distances.


In place of eyes, Aeolus has wide-angle cameras with three-dimensional sensors. The robot can vacuum, wash floors, dust, clean windows and much more. It even remembers where things belong in the house so that, if necessary, they can be returned to their place. Although this is still a prototype, the creators say the product will go on sale this year at a price no lower than that of a family vacation abroad, whatever that amounts to.

4. “Smart” home robot “Aido”

"Aido" is a unique interactive home robot that can easily maneuver around the house. The creators of Aido built a set of functions into the robot based on a study of frequently requested options from electronic home assistants.


This family robot was designed to make its interaction with people as natural and intuitive as possible. Inside the Aido there is a subwoofer and speakers for a home theater system.
Interaction with people is handled by an innovative speech recognition system. The robot can help with housework, play with children, provide home security and even carry out programmed tasks. "Aido" by Ingen Dynamic Inc. is available for pre-order for $499.

5. Family Friend "Buddy"

Buddy is an open source robot built on Unity 3D and Android. This social robot assistant from Blue Frog Robotics is a great companion for the whole family: it can greet you, remind you of events and even guard your home, all with a beaming smile on its face.


Buddy's patrol abilities are fairly advanced: it can watch a single spot or move between specified points. Like the other models mentioned earlier, Buddy boasts advanced intelligence.

As a child I watched Star Wars, saw the robots C-3PO and R2D2 and dreamed of a robot of my own. That desire grew even stronger when I saw Rosie, the robot housekeeper in The Jetsons, who did the housework flawlessly. I've always felt that a personal robot could come in handy as your own chef, trainer and companion. It seems the future has arrived: more and more advanced home robots are appearing on the market, some even cheaper than an iPhone or a MacBook. Let's take a look at 12 personal robots for your home: some are more human-like than others, but all of them can make your life better.

Pepper from SoftBank Robotics

Pepper is one of the most human-like robots in this line-up. The robot is said to be able to recognize human emotions: Pepper detects how you feel and responds in the appropriate mood. Today Pepper is used in a variety of commercial settings in Japan, but it could also make a great friend at home.

Jibo

Jibo is a cute little robot that reminds me of WALL-E from the Pixar film of the same name. It doesn't move around on its own, but this family robot learns from everyone who interacts with it and remembers everything you share with it. Jibo will talk to you the next time you enter the room, and don't be surprised if it makes a joke.

Kuri from Mayfield Robotics

Kuri is a fun friend and a good helper with serious technology inside. The robot is mobile and equipped with WiFi, Bluetooth, a 1080p camera and facial recognition. It also has telepresence capabilities, letting you talk through the robot with other people at home. It can move around the house, avoiding obstacles and pets, and additionally act as a home surveillance system, since it hears and sees everything that happens in the house. Kuri makes beeping sounds similar to R2D2 from Star Wars, and it can take daily photos and store them in an app on your phone, where you can view, edit and send them to friends.


Zenbo from Asus

Zenbo is a smart mobile robot that can communicate, help and entertain you whenever you need it. While you are at home, Zenbo learns and adapts to you, and when necessary, shares its emotions. Zenbo can help with reminders, control home devices, act as a security system while you're away, and even entertain kids by reading them stories.

Lynx from Ubtech

Lynx is a humanoid robot that gives the Alexa voice assistant mobility. Lynx can order the items you need directly from Amazon using simple voice commands. It comes with facial recognition and personalized greetings, can play music, and provides security in your absence by streaming what is happening in the house.

Budgee from 5Elements Robotics

Looking for another pair of hands to help you carry things around the house or yard? Then you need Budgee. Budgee is a friendly, hard-working robot that helps you carry things.

Hub Robot from LG

Hub Robot from LG is a smart home assistant controlled by the Alexa voice service. It can make your home smarter if you simply install it in the busiest spot in the house. The robot responds to your movements with nods and simple answers, and it can take care of everything from setting the mood and playing music to turning the air conditioner on and off. The interactive display shows messages, videos and photos, and facial recognition lets Hub Robot tell family members apart. The robot is not yet on sale, but since it was first shown at the Consumer Electronics Show in January 2017, we hope it will be available soon.

Olly Robot from Emotech

The Olly robot is a mix of smart home hub and personal robot. This round desktop device is designed to make your day better: built by London-based startup Emotech, Olly learns how you like to communicate, then answers your questions and controls your connected devices.

Robo Temi

Temi is a personal robot for the home: it's smarter than a telepresence robot, so it can take on the role of a personal assistant on wheels. Temi was designed as a video chat and music machine, keeping you entertained and connected. It runs on Android, so it's compatible with many of your favorite apps.

Aido from Ingen Dynamic

Aido is a family home robot that can move around the house, help and improve your life. Aido can do it all: from playing with your children to helping around the house, managing everything you plan. Aido can keep your home connected and safe with mobile and visual capabilities.

Personal Robot from Robot Base

This creatively named robot can do a lot. Personal Robot is equipped with all the basics: facial recognition, photo capture, an alarm clock, accurate speech recognition and offline navigation. It can also build a map of your home using navigation and mapping algorithms, and it can interact with other home devices, such as the Nest smart thermostat, to help automate your home.

Q.Bo, an open source personal robot

Looking for a two-in-one: a personal robot that is also a platform for experiments? Meet Q.Bo, an open source robot that lets you add the features you need to build the most customizable robot possible. Q.Bo comes with basic functions and capabilities out of the box, but they can be extended. This robot is great for children, parents and teachers, because you can invent and customize exactly the kind of personal robot you want.

BONUS


Erica, a Japanese android robot

This robot is not yet being produced for a mass audience; I just thought it was worth including on the list to show where robots are heading. Erica reminds me of the sinister robots from Westworld. It is quite possible that robots will soon be more human-like, like Erica, and will even work among us. Or maybe they already do. Cue the ominous music.
We are still at the beginning of the era of artificial intelligence and personal robots for the home. I am confident that this generation of robots, which today look the way the first HTML pages did during the Internet revolution, will be replaced by far more advanced tools. Still, it's exciting to watch things move toward the future we have seen on the big screen and eagerly awaited for decades.

It is much easier to be a human than to build one. Take, for example, playing catch with a friend as a child. Break this activity down into its separate biological functions and the game stops being simple. You need sensors, transmitters and effectors. You need to calculate how hard to throw the ball so that it covers the distance to your companion. You need to account for sun glare, wind speed and nearby distractions. You need to determine how the ball spins and how to catch it. And there are side scenarios to consider: what if the ball flies over your head? What if it clears the fence? What if it breaks a neighbor's window?

These questions illustrate some of the most pressing challenges in robotics, and they set the stage for our countdown. Here is a list of the ten hardest things to teach robots, the ten we must conquer if we are ever to realize the promises made by Bradbury, Dick, Asimov, Clarke and other science fiction writers who imagined worlds where machines behave like people.


Navigate a new environment

Moving from point A to point B has seemed simple to us since childhood; we humans do it every day, every hour. For a robot, though, navigation, especially through an environment that is constantly changing or that it has never seen before, is a daunting task. First, the robot must be able to perceive its surroundings, and then it must make sense of all the incoming data.

Roboticists address the first problem by arming their machines with an array of sensors, scanners, cameras and other high-tech tools for assessing the surroundings. Laser scanners are increasingly popular, although they cannot be used underwater because water severely distorts the light. Sonar appears to be a viable alternative for underwater robots, but it is much less accurate on land. A machine vision system built from a set of integrated stereoscopic cameras can also help the robot “see” its landscape.

Collecting environmental data is only half the battle. The bigger challenge is processing that data and using it to make decisions. Many developers run their robots against a predefined map or have them build one on the fly. In robotics this is known as SLAM, simultaneous localization and mapping. Mapping here refers to how the robot converts sensor readings into a given representation; localization refers to how the robot positions itself relative to that map. In practice, the two processes must happen at the same time, in chicken-and-egg fashion, which is only feasible with powerful computers and advanced algorithms that estimate position from probabilities.
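
To make the "position from probabilities" idea concrete, here is a minimal sketch of a particle filter for the localization half of the problem. It assumes a toy one-dimensional corridor whose landmark positions are already known (real SLAM estimates the map and the pose together), and all values are hypothetical.

```python
import math
import random

# A minimal sketch of probability-based localization (the "L" in SLAM),
# assuming a toy one-dimensional corridor with known landmark positions.

LANDMARKS = [2.0, 5.0, 9.0]        # positions of landmarks along the corridor

def sense(true_pos, noise=0.2):
    """Simulated range sensor: distance to the nearest landmark, plus noise."""
    return min(abs(true_pos - lm) for lm in LANDMARKS) + random.gauss(0, noise)

def likelihood(particle, measurement, noise=0.2):
    """How well this particle's predicted reading matches the actual one."""
    predicted = min(abs(particle - lm) for lm in LANDMARKS)
    return math.exp(-((predicted - measurement) ** 2) / (2 * noise * noise))

def particle_filter_step(particles, control, measurement):
    # 1. Motion update: move every particle and add motion noise.
    moved = [p + control + random.gauss(0, 0.1) for p in particles]
    # 2. Measurement update: weight particles by how well they explain the sensor.
    weights = [likelihood(p, measurement) for p in moved]
    total = sum(weights) or 1e-9
    # 3. Resample: keep particles in proportion to their weights.
    return random.choices(moved, weights=[w / total for w in weights], k=len(moved))

true_pos = 1.0
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
for _ in range(15):
    true_pos += 0.5                                  # the robot drives forward
    particles = particle_filter_step(particles, 0.5, sense(true_pos))

estimate = sum(particles) / len(particles)
print(f"true position: {true_pos:.2f}, estimated: {estimate:.2f}")
```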

Demonstrate dexterity


Robots have been assembling packages and parts in factories and warehouses for many years. But in those settings they rarely encounter people and almost always handle objects of the same shape in an uncluttered environment. The life of such a factory robot is dull and routine. If a robot is to work in a home or a hospital, it will need an advanced sense of touch, the ability to detect people nearby, and impeccable judgment in choosing its actions.

These skills are extremely difficult to teach a robot. Traditionally, scientists have not taught robots to touch at all, instead programming them to fail if they bump into another object. Over the past five years or so, however, significant advances have been made in compliant robots and artificial skin. Compliance refers to how flexible a robot is: flexible machines are more compliant, rigid ones less so.

In 2013, researchers at Georgia Tech created a robotic arm with spring-loaded joints that allow the arm to bend and interact with objects, much like a human hand. They then covered the whole thing with a “skin” that could sense pressure or touch. Some robot skins contain hexagonal chips, each equipped with an infrared sensor that detects any approach closer than a centimeter. Others feature electronic fingerprints, a ridged, rough surface that improves grip and facilitates signal processing.

Combine these high-tech arms with an advanced vision system and you get a robot that can give you a gentle massage or sort through a folder of documents, choosing from a huge collection.

Keep the conversation going


Alan Turing, one of the founders of computer science, made a bold prediction in 1950: one day machines will be able to speak so freely that you will not be able to tell them apart from humans. Alas, so far robots (and even Siri) have not lived up to Turing's expectations. That's because speech recognition is significantly different from natural language processing - what our brains do to extract meaning from words and sentences during conversation.

Initially, scientists assumed that replicating this would be as simple as loading grammar rules into a machine's memory. But attempts to hard-code the grammar of each individual language simply failed. Even determining the meanings of individual words proved very difficult (homonyms exist, after all: in Russian, for example, the same word means both a door key and a treble clef). Humans learn to resolve such meanings from context, drawing on mental abilities shaped by many years of evolution, but reducing that skill to strict rules that can be written as code has proven all but impossible.

As a result, many robots today process language statistically. Scientists feed them enormous bodies of text, known as corpora, and let the computers break the long texts into chunks to work out which words often occur together and in what order. This lets the robot “learn” a language through statistical analysis.
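
A minimal sketch of that counting idea, on a toy corpus of my own invention, might look like this; real systems train on billions of words and use far richer models, but the core step is the same.

```python
from collections import Counter, defaultdict

# Count which words tend to follow which in a (tiny, hypothetical) corpus,
# then predict the most frequent continuation.

corpus = "the robot picks the apple and the robot drops an apple".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1            # how often `nxt` follows `prev`

def most_likely_next(word):
    """Return the most frequent continuation seen in the corpus, if any."""
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(most_likely_next("the"))         # -> 'robot' (seen twice, vs 'apple' once)
```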

Learn new things


Imagine that someone who has never played golf decides to learn how to swing a club. He can read a book about it and then try, or he can watch a famous golfer practice and then try it himself. Either way, he will pick up the basics quickly.

Roboticists face particular challenges when trying to build an autonomous machine that can learn new skills. One approach, as with golf, is to break the activity down into precise steps and program them into the robot's brain. This requires every aspect of the activity to be isolated, described and coded, which is not always easy. Some aspects of a golf swing, such as the interplay of wrist and elbow, are hard to put into words: these subtle details are easier to show than to describe.

In recent years, scientists have made real progress in teaching robots to imitate a human operator. They call this imitation learning, or learning from demonstration (LfD). How does it work? The machines are fitted with arrays of wide-angle and zoom cameras, which let the robot “see” the teacher performing a given action. Learning algorithms process this data to build a mathematical map of features linking visual input to the desired actions. Of course, LfD robots must be able to ignore incidental aspects of the teacher's behavior, such as scratching an itchy nose, and cope with the problems that arise from the differences between robot and human anatomy.
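
As a rough illustration of mapping observed features to demonstrated actions, here is a toy sketch under the assumption that each demonstration frame has already been reduced to a small feature vector paired with the teacher's action; the feature names and numbers are hypothetical.

```python
from math import dist

# A minimal learning-from-demonstration sketch: remember (features, action)
# pairs from the teacher, then reproduce the action taken in the most
# similar remembered situation (nearest-neighbour lookup).

demonstrations = [
    # ([object_x, object_y] as seen by the robot, teacher's action)
    ([0.2, 0.9], "reach_left"),
    ([0.8, 0.9], "reach_right"),
    ([0.5, 0.1], "wait"),
]

def imitate(observed_features):
    """Choose the action whose demonstrated situation is closest to the current one."""
    _, action = min(demonstrations, key=lambda d: dist(d[0], observed_features))
    return action

print(imitate([0.75, 0.85]))   # -> 'reach_right'
```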

Deceive


The curious art of deception developed among animals in order to outperform competitors and avoid being eaten by predators. In practice, deception as an art of survival can be a very, very effective self-preservation mechanism.

For robots, learning to deceive people or other robots may be incredibly difficult (and perhaps that is just as well for you and me). Deception requires imagination, the ability to form ideas or images of external objects that are not present to the senses, and machines generally lack it. They are strong at processing raw data from sensors, cameras and scanners, but they cannot form concepts that go beyond that sensory data.

On the other hand, future robots may get better at deception. Georgia Tech scientists managed to transfer some of the deceptive skills of squirrels to robots in the lab. First they studied the cunning rodents, which protect their food caches by luring competitors to old, unused storage spots. Then they encoded this behavior as simple rules and loaded it into the brains of their robots. The machines could use these algorithms to decide when deception might be useful in a given situation and then mislead a companion by luring it to a spot where nothing valuable is stored.
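
A rule of that kind can be very small indeed. The sketch below is a hypothetical illustration of the squirrel-inspired idea, not the researchers' actual code; the cache coordinates and the "competitor_nearby" signal are invented.

```python
# A minimal rule-based sketch: deceive (visit a decoy cache) only when a
# competitor is watching, otherwise guard the real cache.

def choose_patrol_target(real_cache, decoy_caches, competitor_nearby):
    """Head for a decoy cache when a competitor is watching, otherwise guard the real one."""
    if competitor_nearby and decoy_caches:
        return decoy_caches[0]          # lure the competitor toward an empty cache
    return real_cache

print(choose_patrol_target(real_cache=(3, 4),
                           decoy_caches=[(8, 1), (0, 7)],
                           competitor_nearby=True))    # -> (8, 1)
```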

Anticipate human actions


In The Jetsons, Rosie the robot maid was able to hold a conversation, cook, clean, and help George, Jane, Judy, and Elroy. To understand the quality of Rosie's work, just remember one of the opening episodes: Mr. Spacely, George's boss, comes to the Jetson house for dinner. After the meal, he takes out a cigar and places it in his mouth, and Rosie rushes forward with a lighter. This simple action represents complex human behavior - the ability to predict what will happen next based on what just happened.

Like deception, anticipating human actions requires the robot to imagine a future state. It should be able to say: “If I see a person doing A, then from past experience I can guess that they are likely to do B.” This has proven extremely hard in robotics, but progress is being made. A Cornell University team has developed an autonomous robot that can respond to how its companion interacts with objects in the environment. It uses a pair of 3D cameras to capture images of the surroundings; an algorithm then identifies the key objects in the room and singles them out. Drawing on a wealth of information from previous training, the robot builds a set of expectations about the person's movements and the objects they touch, draws a conclusion about what will happen next, and acts accordingly.
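
In its simplest form, this "A is likely to be followed by B" reasoning can be sketched as conditional prediction over labeled actions. The sketch below assumes the perception pipeline already labels what the person is doing; the action labels and the ASSIST table are hypothetical.

```python
from collections import Counter, defaultdict

# Count which human action has tended to follow which, predict the most
# frequent successor, and pick a helpful robot response.

history = ["takes_cigar", "looks_for_lighter", "takes_cup", "pours_coffee",
           "takes_cigar", "looks_for_lighter", "takes_cup", "pours_coffee"]

transitions = defaultdict(Counter)
for current, following in zip(history, history[1:]):
    transitions[current][following] += 1

ASSIST = {"looks_for_lighter": "bring_lighter", "pours_coffee": "bring_sugar"}

def respond(observed_action):
    """Anticipate the person's next action and choose an assisting behaviour."""
    followers = transitions.get(observed_action)
    if not followers:
        return "wait"
    predicted = followers.most_common(1)[0][0]
    return ASSIST.get(predicted, "wait")

print(respond("takes_cigar"))   # -> 'bring_lighter', much as Rosie did
```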

Cornell's robots make mistakes sometimes, but they make good progress as camera technology improves.

Coordinate activities with other robots


Building a single large-scale machine (even an android, if you like) requires a serious investment of time, energy and money. Another approach is to deploy an army of simpler robots that act together to achieve complex goals.

This raises a number of problems. A robot working in a team must be able to position itself relative to its teammates and communicate effectively, both with other machines and with a human operator. To solve these problems, scientists have turned to the insect world, where complex swarming behavior helps colonies find food and solve problems that benefit the whole group. Studying ants, for example, scientists realized that individuals use pheromones to communicate with one another.

Robots can use the same "pheromone logic," relying on light rather than chemicals to communicate. It works like this: a group of tiny robots is dispersed in a confined space. At first they explore at random, until one comes across a light trail left by another bot. It knows to follow the trail, and it does, leaving its own trail behind. As the trails merge into one, more and more robots file along the same path.
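
A toy simulation of this trail-following idea, under the assumption of a simple grid world where a cell's trail value just counts how often a bot has crossed it, might look like the sketch below; all parameters are arbitrary.

```python
import random

# Bots prefer neighbouring cells with stronger trails and reinforce them,
# so shared paths emerge from purely local rules.

SIZE = 10
trail = [[0] * SIZE for _ in range(SIZE)]           # trail intensity per cell
bots = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(5)]

def neighbours(x, y):
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE]

def step(pos):
    x, y = pos
    options = neighbours(x, y)
    if any(trail[ny][nx] > 0 for nx, ny in options):
        nx, ny = max(options, key=lambda c: trail[c[1]][c[0]])   # follow the strongest trail
    else:
        nx, ny = random.choice(options)                          # explore at random
    trail[ny][nx] += 1                                           # leave our own mark
    return nx, ny

for _ in range(50):
    bots = [step(b) for b in bots]

print(sum(map(sum, trail)), "trail marks laid in total")         # -> 250
```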

Copy themselves


The Lord said to Adam and Eve: “Be fruitful and multiply and fill the earth.” A robot given that command would be embarrassed or disappointed. Why? Because it cannot reproduce. Building a robot is one thing; building a robot that can make copies of itself, or regenerate lost or damaged components, is quite another.

Notably, robots need not take people as their model of reproduction. You may have noticed that we do not split into two equal halves; protozoa, however, do it all the time. Hydras, relatives of jellyfish, practice a form of asexual reproduction known as budding: a small bud grows on the parent's body and then breaks off to become a new, genetically identical individual.

Scientists are working on robots that can perform a similarly simple cloning procedure. Many of these robots are built from repeating modules, usually identical cubes, each carrying the same self-replication program. The cubes have magnets on their surfaces so they can attach to and detach from neighboring cubes, and each cube is divided in two along a diagonal so either half can exist independently. The whole robot consists of several cubes assembled into a particular shape.

Act on principle


Interacting with people every day, we make hundreds of decisions, and in each of them we weigh our options, judging what is good and what is bad, what is fair and what is unfair. If robots are to be like us, they will need to understand ethics.

But as with language, coding ethical behavior is extremely difficult, mainly because there is no single, generally accepted set of ethical principles. Different countries have different rules of behavior and different systems of law, and even within one culture, regional differences affect how people evaluate their own actions and those of others. Trying to write a global ethics applicable to all robots turns out to be nearly impossible.

That is why scientists limit the scope of the ethical problem. If a machine operates in one particular setting, a kitchen, say, or a patient's room, it has far fewer rules of conduct and fewer laws to guide its ethical decisions. To achieve this, robotics engineers build ethical choices into the machine's learning algorithm, based on three flexible criteria: how much good an action will do, how much harm it will cause, and how fair it is. With this kind of artificial intelligence, your future home robot will be able to decide exactly who in the family should wash the dishes and who gets the TV remote for the evening.
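
One way to picture that three-criterion choice is a simple weighted score over a small set of candidate actions, as in the sketch below. The actions, scores and weights are entirely hypothetical; the point is only that a narrow, well-defined domain makes such ranking tractable.

```python
# Rank candidate actions by a weighted sum of benefit, harm and fairness.

WEIGHTS = {"benefit": 1.0, "harm": -1.5, "fairness": 0.8}

candidate_actions = {
    # action: estimated benefit, harm and fairness on a 0..1 scale
    "ask_alice_to_wash_dishes": {"benefit": 0.6, "harm": 0.1, "fairness": 0.9},
    "ask_bob_to_wash_dishes":   {"benefit": 0.6, "harm": 0.1, "fairness": 0.3},
    "wash_dishes_itself":       {"benefit": 0.8, "harm": 0.0, "fairness": 1.0},
}

def ethical_score(criteria):
    """Weighted sum: reward benefit and fairness, penalize harm."""
    return sum(WEIGHTS[name] * value for name, value in criteria.items())

best = max(candidate_actions, key=lambda a: ethical_score(candidate_actions[a]))
print(best)   # -> 'wash_dishes_itself'
```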

Feel the emotions

“And now here is my secret, a very simple secret: it is only with the heart that one can see rightly; what is essential is invisible to the eye.”

If this remark by the Fox from Antoine de Saint-Exupéry's The Little Prince is true, then robots will miss the most beautiful and best things in this world. They are superb at sensing the world around them, but they cannot turn that sensory data into emotions. They cannot see the smile of a loved one and feel joy, or register the angry grimace of a stranger and tremble with fear.

This, more than anything else on our list, is what separates man from machine. How to teach a robot to fall in love? How to program disappointment, disgust, surprise or pity? Is it even worth trying?

Some people think it is worth trying. They believe the robots of the future will combine cognitive and emotional systems, which means they will work better, learn faster and interact with people more effectively. Believe it or not, prototypes of such robots already exist, able to express a limited range of human emotions. Nao, a robot developed by European scientists, has the emotional range of a one-year-old child: it can express happiness, anger, fear and pride, accompanying the emotions with gestures. And this is just the beginning.

