A Foundational Overview of AI & Machine Learning
Artificial intelligence (AI) and machine learning (ML) are two exciting areas of technological development that are making headlines and becoming indispensable to businesses across all sectors. The AI market is expected to grow rapidly over the next decade as new uses for these technologies are discovered and implemented at a rapid pace. One thing is for sure: AI and ML are here to stay, and you should get familiar with them, because if you're not already using them, odds are you will be soon.
AI vs. ML
First, it's important to understand the difference between AI and ML (they're not exactly the same thing). AI is the development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, and decision-making. So, then, what's ML? ML is an application of AI that gives systems the ability to automatically learn and improve from experience without being explicitly programmed. Some of the tools used to develop machine learning include:
- genetic algorithms – models that use a survival-of-the-fittest approach to evaluate and refine the behavior of artificial intelligence agents, and
- neural networks – complex, interconnected networks of artificial neurons, called perceptrons, that can interpret sensory data.
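To make the second item above concrete, here is a minimal sketch (a hypothetical example, not taken from the article or video) of a single perceptron: it computes a weighted sum of its inputs, passes it through a step function, and adjusts its weights with the classic perceptron learning rule. Training it on the AND gate shows "learning from experience without being explicitly programmed" in miniature.

```python
def step(x):
    """Step activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a two-input perceptron with the perceptron learning rule."""
    w = [0.0, 0.0]  # input weights
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            # Nudge weights and bias toward the correct output
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The AND gate: output is 1 only when both inputs are 1
AND_GATE = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND_GATE)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in AND_GATE]
print(predictions)  # → [0, 0, 0, 1]
```

The network was never told the rule for AND; it inferred usable weights purely from labeled examples. Real neural networks stack many such units in layers, but the learning loop above captures the core idea.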
In the video "A Foundational Overview of AI and ML," Geisel Software engineer Liz Couture explains the differences between AI and ML, discusses foundational concepts, and demonstrates some of their most exciting applications.
Kristin Wattu