What Does Artificial Intelligence (AI) Mean?

Artificial intelligence (AI), also known as machine intelligence, is a branch of computer science that focuses on developing and managing technology that can learn to make decisions and carry out actions autonomously on behalf of humans.

AI is not a single technology. Instead, it is an umbrella term that includes any type of software or hardware component that supports machine learning (ML), computer vision (CV), natural language understanding (NLU), natural language generation (NLG), natural language processing (NLP) and robotics.

Today’s AI uses conventional CMOS hardware and the same basic algorithmic functions that drive traditional software. Future generations of AI are expected to give rise to new brain-inspired circuits and architectures that can make data-driven decisions faster and more accurately than a human being can.

AI Use Cases in Business

AI is currently being applied to a range of functions, both in the lab and in commercial and consumer settings, through technologies that include the following:

• Speech Recognition allows an intelligent system to convert human speech into text or code.

• Natural Language Processing enables conversational interaction between humans and computers.

• Computer Vision allows a machine to scan an image and use comparative analysis to identify objects in the image.

• Machine Learning focuses on building algorithmic models that can identify patterns and relationships in data.

• Expert Systems gain knowledge about a specific subject and can solve problems as accurately as a human expert in that subject.

At its heart, AI uses the same basic algorithmic functions that drive traditional software but applies them in a different way. Perhaps the most revolutionary aspect of AI is that it allows software to rewrite itself as it adapts to its environment.
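To make that idea concrete, here is a minimal sketch of the machine learning approach listed above, in standard-library Python with made-up numbers: the decision rule is derived from the data itself rather than written out by a programmer, and refitting it on fresh data is the simplest sense in which such software adapts to its environment.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, learned from the data itself."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

# Hypothetical observations: machine hours worked vs. parts produced.
hours = [1, 2, 3, 4, 5]
parts = [11, 19, 31, 42, 49]
a, b = fit_line(hours, parts)
print(f"learned model: parts = {a:.1f} * hours + {b:.1f}")   # parts = 9.9 * hours + 0.7
print(f"prediction for 6 hours: {a * 6 + b:.0f}")            # 60

No programmer ever wrote the 9.9 or the 0.7; they came out of the data, and running the same code on next month's numbers would produce a different model without anyone editing the software.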

Techopedia Explains Artificial Intelligence (AI)

While AI often evokes images of the sentient computer overlords of science fiction, the current reality is far different.

What are the types of AI and how do they differ?

AI is often spoken about in terms of being either weak or strong. Today, most business applications of AI are machine-learning applications of weak AI.

• Narrow (Weak) AI is capable of performing only a limited set of predetermined functions.

• General (Strong) AI is said to equal the human mind’s ability to function autonomously according to a wide set of stimuli.

• Super AI is expected one day to exceed human intelligence (and conceivably take over the world).

AI initiatives are also talked about in terms of belonging to one of four categories:

1. Reactive AI relies on real-time data to make decisions.

2. Limited Memory AI relies on stored data to make decisions.

3. Theory of Mind AI can consider subjective elements such as user intent when making decisions.

4. Self-Aware AI possesses a human-like consciousness that is capable of independently setting goals and using data to decide the best way to achieve an objective.

A good way to visualize these distinctions is to imagine AI as a professional poker player. A reactive player bases all decisions on the current hand in play, while a limited memory player will consider their own and other players' past decisions.

A Theory of Mind player factors in other players' behavioral cues, and, finally, a self-aware player stops to consider whether playing poker for a living is really the best use of their time and effort.
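The poker analogy can also be sketched in a few lines of Python. The agents, thresholds and numbers below are purely illustrative, but they show the structural difference between the first two categories: a reactive agent decides from the current observation alone, while a limited memory agent also consults data it has stored from earlier rounds.

class ReactiveAgent:
    """Decides from the strength of the current hand alone (real-time data only)."""
    def decide(self, hand_strength):
        return "bet" if hand_strength > 0.6 else "fold"

class LimitedMemoryAgent:
    """Also remembers how heavily each opponent has bet in past rounds."""
    def __init__(self):
        self.opponent_bets = []          # stored historical data
    def observe(self, opponent_bet):
        self.opponent_bets.append(opponent_bet)
    def decide(self, hand_strength):
        # An opponent who habitually bets big may be bluffing, so lower the bar to call.
        avg = sum(self.opponent_bets) / len(self.opponent_bets) if self.opponent_bets else 0.0
        return "bet" if hand_strength > 0.6 - 0.2 * min(avg, 1.0) else "fold"

player = LimitedMemoryAgent()
player.observe(0.9)                      # opponent bet heavily in an earlier round
print(ReactiveAgent().decide(0.5))       # fold: the current hand alone is too weak
print(player.decide(0.5))                # bet: stored history shifts the decision

Theory of Mind and self-aware agents have no comparably simple sketch, which is one reason they remain research goals rather than shipping products.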

The Evolving Stages of Artificial Intelligence

Artificial intelligence can be used to replace an entire system, making all decisions end to end, or to enhance a specific process. A standard warehouse management system, for example, can show the current levels of various products, while an intelligent one could identify shortages, analyze their cause and effect on the overall supply chain, and even take steps to correct them.
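That second, process-enhancing pattern might look something like the following Python sketch, with hypothetical products, sales figures and a naive moving-average forecast standing in for a real demand model:

def forecast_daily_demand(recent_daily_sales):
    """Naive forecast: the average of recent daily sales."""
    return sum(recent_daily_sales) / len(recent_daily_sales)

def flag_shortages(stock, sales_history, lead_time_days=5):
    """Return products whose stock will not cover forecast demand before a reorder arrives."""
    at_risk = []
    for product, units in stock.items():
        expected = forecast_daily_demand(sales_history[product]) * lead_time_days
        if units < expected:
            at_risk.append(product)
    return at_risk

stock = {"widget": 40, "gadget": 300}
sales_history = {"widget": [10, 12, 9, 11], "gadget": [20, 18, 22, 19]}
print(flag_shortages(stock, sales_history))   # ['widget'], so reorder before it runs out

The ordinary system stops at reporting the numbers in stock; the intelligent layer adds the forecast and the recommendation on top of them.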

The demand for faster, more energy-efficient information processing is growing exponentially as AI becomes more prevalent in business applications. Conventional digital processing hardware cannot keep up with this demand. That is why researchers are taking inspiration from the brain and considering alternative architectures in which networks of artificial neurons and synapses process information with high speed and adaptive learning capabilities in an energy-efficient, scalable manner.
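For intuition, the basic unit of such an architecture can be sketched in ordinary Python: a leaky integrate-and-fire neuron that accumulates input over time and fires only when a threshold is crossed, so work happens as sparse events rather than on every clock cycle. The numbers below are illustrative and not tied to any particular neuromorphic chip.

def simulate_neuron(inputs, threshold=1.0, leak=0.9):
    """Leak a little each step, integrate the incoming current, spike on threshold."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # leaky integration
        if potential >= threshold:
            spikes.append(t)                     # event-driven output
            potential = 0.0                      # reset after firing
    return spikes

print(simulate_neuron([0.3, 0.4, 0.5, 0.0, 0.2, 0.9]))   # [2, 5]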

Posted Jan 15, 2023 in IT & Software
