Post by account_disabled on Mar 5, 2024 3:52:05 GMT
The Massachusetts Institute of Technology (MIT) has developed Air Guardian, an artificial intelligence (AI) system that accompanies airplane pilots and assists them during flights. According to the researchers, the software is designed to work in symbiosis with the person in the cockpit and only takes control of the aircraft in an emergency.
MIT's Computer Science and Artificial Intelligence Laboratory describes its AI as a collaborative control mechanism between humans and computers. Air Guardian's objective is to support the pilot at moments when their attention is divided within the aircraft's cockpit. The scientists in charge say the approach is novel: an AI that compensates for the limits of the pilot's concentration while respecting their autonomy.
Unlike an autopilot system, which engages only when a serious safety issue arises, MIT's AI is proactive and preventive. It analyzes flight behavior and makes small adjustments by interpreting where the pilot is looking.
“Imagine you are in a plane with two pilots, a human and a computer. They both have their hands on the controllers, but they're always looking for different things. If both pay attention to the same thing, the human can drive. But if the human gets distracted or misses something, the computer quickly takes control,” MIT summarizes in its official blog.
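To make the quoted behavior concrete, here is a minimal sketch in Python of a handoff rule of the kind described: when the pilot's and the guardian's attention overlap, the human steers; when they diverge, the computer's corrections are blended in. This is not MIT's actual code; the function names, the threshold and the linear blending are illustrative assumptions.

import numpy as np  # not strictly needed here, kept for consistency with later sketches

def blend_controls(pilot_cmd: float, guardian_cmd: float,
                   attention_overlap: float, threshold: float = 0.6) -> float:
    """Return the control command actually applied to the aircraft.

    attention_overlap: similarity in [0, 1] between the pilot's gaze-based
    attention and the AI's saliency map (1.0 = both focused on the same thing).
    Threshold and blending scheme are hypothetical.
    """
    if attention_overlap >= threshold:
        # Pilot and guardian agree on what matters: the human stays in charge.
        return pilot_cmd
    # Attention has diverged: fade the guardian's command in proportionally.
    alpha = 1.0 - attention_overlap / threshold  # 0 when aligned, 1 when fully diverged
    return (1.0 - alpha) * pilot_cmd + alpha * guardian_cmd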
Eye tracking and interpretation in real time: this is how Air Guardian works
Air Guardian analyzes the pilot's focus of attention during flight using eye tracking. It also reads the information shown on the cockpit displays using saliency mapping, a technique that highlights which parts of an image a model is attending to rather than simply projecting the raw data. With the help of these attention markers, the AI, which is based on liquid neural networks, identifies early signals of possible risks.
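As one illustration of how the two attention signals might be compared, the snippet below treats the pilot's gaze data and the model's saliency output as two heatmaps over the same display and scores their overlap with a cosine similarity. This is a hedged sketch under assumed names and data shapes, not the system's published method.

import numpy as np

def attention_overlap(gaze_heatmap: np.ndarray, saliency_map: np.ndarray) -> float:
    """Cosine similarity in [0, 1] between two non-negative attention maps.

    gaze_heatmap: 2-D array built from eye-tracking fixations on the display.
    saliency_map: 2-D array of the model's saliency over the same display.
    Both inputs are illustrative; the real system's representation may differ.
    """
    g = gaze_heatmap.ravel().astype(float)
    s = saliency_map.ravel().astype(float)
    denom = np.linalg.norm(g) * np.linalg.norm(s)
    return float(g @ s / denom) if denom > 0 else 0.0

A value near 1 means pilot and AI are focused on the same region of the display; a score like this could feed a handoff rule of the kind sketched earlier.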
According to the MIT lab's report, the tool passed its simulated piloting tests: the pilot and the AI independently made similar decisions in controlled situations. Air Guardian also found the shortest route to a waypoint and reduced the risk rate in its simulated flight.
"Our use of liquid neural networks provides a dynamic and adaptive approach, ensuring that AI does not simply replace human judgment, but complements it, leading to greater safety and collaboration in the skies," said Ramin Hasani, one of the principal investigators of the project.
Development of the AI was funded by the United States Air Force Research Laboratory, the Boeing Company, the Office of Naval Research and the United States Air Force Artificial Intelligence Accelerator. MIT is enthusiastic about the project: beyond its immediate use, Air Guardian could form the basis of future collaborative control mechanisms in land vehicles, maritime vehicles and other activities involving robotic extensions.