Artist's impression of a neural network that connects the observations (left) to the models (right). Credit: EHT Collaboration/Janssen et al. (Phys.org, Self-learning neural network cracks iconic black holes)
Self-learning networks are tools that can do many things better than humans. A self-learning network works with two datasets: the models stored in its database and the observations that are fed into the system. The self-learning neural network compares each observation with a model, and if the model differs from the observation, the system corrects it. The system handles those images as pixels, and it can change the pixels in the model until it fits the observation. That corrected model can then be used by all the learning networks.
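To make that idea concrete, here is a minimal sketch, not the actual EHT pipeline, of how a pixel-based model can be nudged toward an observation: the model image is adjusted step by step until the difference between the two shrinks. The array sizes and the learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the real pipeline): adjust a pixel-based model image
# until it matches an observed image, using plain gradient descent on the
# per-pixel difference between the two.

def fit_model_to_observation(observation, model, learning_rate=0.1, steps=500):
    model = model.copy()
    for _ in range(steps):
        residual = model - observation          # how far each pixel is off
        model -= learning_rate * residual       # nudge the pixels toward the data
    return model

observation = np.random.rand(64, 64)            # stand-in for real observation data
initial_model = np.zeros((64, 64))              # starting model image
fitted = fit_model_to_observation(observation, initial_model)
print(float(np.mean((fitted - observation) ** 2)))  # residual error after fitting
```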
The system can create models itself, or humans can make them for it. The system then sends machines such as drones to operate according to the model. A successful mission, whether a pizza delivery or a military operation, means the system has a suitable model. But if the mission is not completed, the model needs improvement, so the system requires information about what went wrong, and the operators should then fix the model. In the case of pizza delivery, the operators can fly the mission themselves the first time, and the system then creates the model from that data.
That is one way to teach the AI and its network to deliver pizza to the right place. The system can use an image of the person who ordered the pizza and find that person outdoors as well. During the teaching process, the system also needs rules such as minimum flight altitudes.
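One way to picture that feedback loop is a small rule-checking sketch: the mission model (here just a list of waypoints) is reviewed against taught rules such as a minimum flight altitude, and every violation is reported so an operator can fix the model. The altitude limit, the waypoint format, and the checks are illustrative assumptions, not a real drone API.

```python
# Hedged sketch of the feedback idea: review a mission model against taught
# rules and report everything that would make the mission fail.

MIN_ALTITUDE_M = 60.0   # example of a rule the system must be taught

def check_waypoint(waypoint):
    """Return a list of rule violations for one (lat, lon, altitude) waypoint."""
    _lat, _lon, altitude = waypoint
    problems = []
    if altitude < MIN_ALTITUDE_M:
        problems.append(f"altitude {altitude} m is below the minimum {MIN_ALTITUDE_M} m")
    return problems

def review_mission(route):
    """Collect every violation so an operator can fix the model."""
    report = []
    for i, waypoint in enumerate(route):
        for problem in check_waypoint(waypoint):
            report.append(f"waypoint {i}: {problem}")
    return report

route = [(60.17, 24.94, 80.0), (60.18, 24.95, 40.0)]   # second point is too low
for line in review_mission(route):
    print(line)
```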
Above: SingularityHub, This Robot Swarm Can Flow Like Liquid and Support a Human's Weight
The same software that recognizes vehicles can recognize people. The person orders the pizza to a certain GPS point, or to some other point that the drone can find easily. The person can also give an image of themselves to the drone.
In other cases, the operator can draw the route to the delivery point on a city map, and the drone then knows which streets it should follow.
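As a rough illustration of "knowing which streets to follow", the operator-drawn route can be reduced to a start and an end point, and a simple search over a street graph returns the street sequence. The street names and the graph below are made-up assumptions.

```python
from collections import deque

# Illustrative sketch: find the shortest street sequence between two points
# on a tiny, invented street graph using breadth-first search.

STREET_GRAPH = {
    "Depot": ["Main St"],
    "Main St": ["Depot", "Harbor Rd", "Oak Ave"],
    "Harbor Rd": ["Main St", "Market Sq"],
    "Oak Ave": ["Main St", "Market Sq"],
    "Market Sq": ["Harbor Rd", "Oak Ave"],
}

def street_route(start, goal):
    """Return the shortest list of streets from start to goal, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in STREET_GRAPH.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(street_route("Depot", "Market Sq"))   # ['Depot', 'Main St', 'Harbor Rd', 'Market Sq']
```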
The drone can scan things like street names and then find places like a certain shop entrance. Then the drone can start searching for the person who placed the order. If the person has given a personal image to the drone, it can search for that person in the squares, as in the sketch after this paragraph. The same system can also find targets for attack drones. The problem with drones is that they are multipurpose tools, and learning networks can make them more impressive, and more terrifying, than anybody believed.
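Returning to the delivery case, here is a hedged sketch of the person search: a feature vector from the customer's reference photo is compared against vectors from people the drone sees, and the best match above a threshold is chosen. The vectors here are random placeholders; a real system would obtain them from a person re-identification model, which is assumed rather than shown.

```python
import numpy as np

# Hedged sketch of matching the ordering customer against people in view
# using cosine similarity between placeholder feature vectors.

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_customer(reference_vec, candidate_vecs, threshold=0.8):
    """Return the index of the best-matching candidate, or None if no match."""
    best_idx, best_score = None, threshold
    for i, vec in enumerate(candidate_vecs):
        score = cosine_similarity(reference_vec, vec)
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

rng = np.random.default_rng(0)
reference = rng.normal(size=128)                        # vector from the customer's photo
candidates = [rng.normal(size=128) for _ in range(5)]   # vectors from people in the square
candidates[3] = reference + rng.normal(scale=0.05, size=128)  # the actual customer
print(find_customer(reference, candidates))             # -> 3
```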
The Ukrainian strike against the Russian strategic air force shows how dangerous those systems can be. The drones can be hidden in things like trucks, and the driver need not even know that they are there. The drones can sit under the roof of a container; when it is close enough to the target, the container can open its hatch and the drones can fly to their targets. The point is that these kinds of systems are far more advanced than they were in 2020. They are fast to develop, and with morphing structures and cheap AI they are extremely effective. The AI can be hard to make, but it is cheap to use.
The new drone swarms can operate like a liquid. Other drones can transport them to their targets, and the swarms can act like the liquid-metal robots in movies. A drone can actually be formed from that kind of robot swarm, so the swarm can travel in the shape of a drone.
Then, at the target, the drone can drop into the water and its shell can turn into a liquid-metal amoeba. That liquid-metal amoeba can then do its duty: it can seal oil leaks or remove cancer from the human body. There are many applications for robots that make morphing structures possible.
https://www.bloomberg.com/features/2025-ukraine-drones-explainer/
https://phys.org/news/2025-06-neural-network-iconic-black-holes.html
https://singularityhub.com/2025/02/24/this-robot-swarm-can-flow-like-liquid-and-support-a-humans-weight/