“Artificial intelligence is often described using language borrowed from human thought, but those word choices may quietly reshape how its capabilities are understood. Researchers analyzed large-scale news writing to see when and how mental language appears in discussions of AI. Credit: Shutterstock”. (ScitechDaily, Does AI Really “Know” Anything? Why Our Words Matter More Than We Think)
The AI doesn’t think as we do. When an AI recognizes an object, it searches its databases and tries to find the action that matches that object. What makes this process hard for programmers is that the object might look different when the robot approaches it from different angles. If we want to make a robot that automatically uses things like hand pumps, we must remember that the pump looks different from different angles. The robot must have an algorithm that allows it to recognize the pump anyway.
One way the robot can recognize the pump is with fuzzy logic. In fuzzy logic, the system calculates the percentage of pixels that match the pump. The system can also store many photographs taken from different angles; in that case, it compares images and searches for matches to the pump. The problem is that the AI doesn’t know what a pump looks like when it is sideways to the observer. If the pump has fallen over, it looks different than when it stands upright.
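The sketch below is only a toy illustration of that idea, not any real robot’s vision code: several reference photos of the pump, taken from different angles, are stored, and a new camera frame is scored by the percentage of pixels that roughly match each stored view. The image size, tolerance, and threshold values are assumptions made just for this example.

```python
import numpy as np

def match_score(frame: np.ndarray, reference: np.ndarray, tolerance: int = 20) -> float:
    """Return the fraction of pixels whose brightness is within `tolerance` of the reference."""
    matching = np.abs(frame.astype(int) - reference.astype(int)) <= tolerance
    return float(matching.mean())

def recognize(frame: np.ndarray, references: dict, threshold: float = 0.8):
    """Compare the frame against every stored view of the pump and keep the best match."""
    best_view, best_score = None, 0.0
    for view_name, reference in references.items():
        score = match_score(frame, reference)
        if score > best_score:
            best_view, best_score = view_name, score
    return (best_view, best_score) if best_score >= threshold else (None, best_score)

# Toy data: random "photos" standing in for the pump seen from three angles.
rng = np.random.default_rng(0)
references = {name: rng.integers(0, 256, (64, 64), dtype=np.uint8)
              for name in ("front", "side", "fallen")}
frame = references["side"].copy()      # pretend the camera currently sees the side view
print(recognize(frame, references))    # ('side', 1.0)
```

Real systems are far more robust than a raw pixel comparison, but the point stays the same: the system only recognizes views it has already been given.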
We know that AI is the ultimate tool in things like math. But the system doesn’t create anything new: it just connects and reconnects information that it gets from its databases. This means that what the AI misses is abstraction. AI can “think” very effectively if it has clear and precise orders, and the information it uses must be handled by following certain rules.
This is the reason why computers are the ultimate calculators. The rules in math are clear: the system must follow a certain order of operations to solve the problem. In math, the problem must be solved by following exact rules, or the result cannot be accepted. Another thing is the input that the user gives to the system. Typed input from the keyboard follows clear rules, but the situation changes when the system must understand spoken language.
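A tiny sketch of that “exact rules” point, written just for this post: the evaluator below walks the syntax tree of an arithmetic expression, so the order of operations is applied mechanically, the same way every time.

```python
import ast
import operator

# Supported binary operators; precedence comes from the parser itself,
# so multiplication always binds tighter than addition.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
}

def evaluate(expression: str) -> float:
    """Evaluate a plain arithmetic expression by following its syntax tree."""
    tree = ast.parse(expression, mode="eval")

    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        raise ValueError("Unsupported expression")

    return walk(tree)

print(evaluate("2 + 3 * 4"))    # 14, never 20: the rules are fixed
print(evaluate("(2 + 3) * 4"))  # 20
```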
“As labor shortages push agriculture toward automation, harvesting delicate, clustered fruits like tomatoes remains a major challenge for robots. Researchers have now developed a system that allows robots to assess how easy a tomato is to pick before acting, using visual cues and probabilistic decision-making. Credit: SciTechDaily.com” (ScitechDaily, Robots That “Think Before They Pick” Could Transform Tomato Farming)
There are many problems in that case. Nonverbal communication plays a big role in communication between humans. Another thing is that spoken language carries so-called hidden meanings. When somebody collides with our car, we can say “how nice”, but that doesn’t mean we think it is nice.
The main thing is this: the AI is like a dog. It is a very effective tool when it searches for and finds something. If we order an AI to create a house, the AI uses models. Those models, or variables, are the bricks or Legos that the system can connect together. But the AI will not create any of those Legos itself; it requires variables that are already stored in its databases.
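A toy illustration of that Lego point, invented for this post: the builder below can only combine components that already exist in its catalogue, and it has no way to invent a brick that was never stored.

```python
from dataclasses import dataclass

# A made-up catalogue of pre-existing components ("Legos").
CATALOGUE = {"wall", "roof", "door", "window"}

@dataclass
class House:
    parts: list

def build_house(requested_parts: list) -> House:
    """Assemble a house from catalogue parts; anything unknown is rejected."""
    unknown = [p for p in requested_parts if p not in CATALOGUE]
    if unknown:
        # The system cannot create a new brick; it can only report the gap.
        raise ValueError(f"No model available for: {unknown}")
    return House(parts=requested_parts)

print(build_house(["wall", "wall", "roof", "door"]))
# build_house(["wall", "solar_chimney"]) would fail: that Lego was never stored.
```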
This is why the AI is like a dog: it finds almost everything, but its lack of abstraction makes it impossible for it to create new things. Still, the AI can “think” before it acts. As in the tomato-picking robot case, the AI compares models that tell it how it should act. The AI recognizes a tomato and then searches its database for the type of movements it should make. The system must also know the right compressive force, so that the tomatoes will not turn into ketchup; and if the grip force is too weak, the tomatoes will drop to the ground.
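The sketch below shows the general shape of such a decision step; the feature names, thresholds, and force limits are assumptions made for illustration, not the actual visual cues or probabilistic model used by the researchers in the article.

```python
def pick_probability(visibility: float, ripeness: float, crowding: float) -> float:
    """Rough estimate, between 0 and 1, of how easy the tomato is to pick."""
    score = 0.5 * visibility + 0.3 * ripeness + 0.2 * (1.0 - crowding)
    return max(0.0, min(1.0, score))

def grip_force(ripeness: float, min_force: float = 1.0, max_force: float = 4.0) -> float:
    """Softer fruit gets a gentler grip, and the force stays inside safe bounds:
    too much force crushes the tomato, too little lets it slip and drop."""
    force = max_force - ripeness * (max_force - min_force)
    return max(min_force, min(max_force, force))

p = pick_probability(visibility=0.9, ripeness=0.8, crowding=0.3)
if p > 0.6:   # only act when the pick looks feasible
    print(f"pick with about {grip_force(0.8):.1f} units of grip force")
else:
    print("skip this tomato for now")
```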
https://scitechdaily.com/does-ai-really-know-anything-why-our-words-matter-more-than-we-think/
https://scitechdaily.com/robots-that-think-before-they-pick-could-transform-tomato-farming/

