Tech Spotlight: What is Artificial Intelligence and Machine Learning?

Author: rscasny | Date Created: 1 Aug 2018 | Last Updated: 18 Feb 2022

Artificial Intelligence (AI) is one of the hottest topics across many industry sectors, from autonomous driving to IoT/IIoT, robotics, and big data analysis. Indeed, the management consultancy McKinsey & Company estimates that the AI market will grow by 15 to 25% a year and will be worth $130 billion by 2025.

Why is AI experiencing such rapid growth? One of the main reasons is that AI gives systems the ability to perform tasks which have traditionally been carried out by humans, and an AI solution is typically significantly faster and more accurate. Most implementations of AI focus on a single task, or a set of tightly constrained tasks, for instance medical diagnosis or autonomous vehicle operation.

What is the difference between Artificial Intelligence and Machine Learning?

What AI does not define is how the intelligence is implemented. One approach would be to follow a traditional software engineering route, where the required functionality is explicitly coded. For complex tasks, however, this takes significant time and quickly becomes inefficient.

This is where Machine Learning comes in. Machine Learning is a subset of AI where, in place of explicitly implementing the required behaviour, the ML algorithm is trained with a dataset to create an analytical model which enables prediction within an acceptable tolerance. In many ML implementations, this algorithm is based upon an Artificial Neural Network.
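To make that concrete, here is a minimal sketch of the train-on-data idea, assuming PyTorch (one of the frameworks discussed later); the toy dataset and model are illustrative only, not from the article. Rather than hand-coding the relationship between input and output, we let training fit it:

    # Toy example: learn y = 3x + 1 from noisy samples instead of coding it.
    import torch
    from torch import nn

    x = torch.linspace(-1, 1, 100).unsqueeze(1)      # 100 training inputs
    y = 3 * x + 1 + 0.1 * torch.randn_like(x)        # noisy targets

    model = nn.Linear(1, 1)                          # one weight, one bias to learn
    loss_fn = nn.MSELoss()                           # measures prediction error
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    for _ in range(200):                             # training loop
        opt.zero_grad()
        loss = loss_fn(model(x), y)                  # how far off are we?
        loss.backward()                              # compute gradients
        opt.step()                                   # nudge weight/bias to fit

    print(model.weight.item(), model.bias.item())    # approaches 3 and 1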

What are Artificial Neural Networks?

Artificial Neural Network is an umbrella term covering a variety of neural network configurations. At the highest level, however, an Artificial Neural Network is modelled on the human cerebral cortex: each neuron receives an input, processes it, and communicates the processed signal to other neurons.

Artificial Neural Networks therefore typically consist of multiple layers, which can be described as an input layer, internal (hidden) layer(s), and an output layer.

Figure 1: Simple Neural Network
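As a sketch of the structure in Figure 1 (assuming PyTorch; the layer sizes here are arbitrary, not taken from the article):

    import torch
    from torch import nn

    # Input layer -> internal (hidden) layer -> output layer.
    net = nn.Sequential(
        nn.Linear(4, 8),    # input layer feeding 8 internal neurons
        nn.Sigmoid(),       # each neuron processes its input (activation)
        nn.Linear(8, 2),    # output layer with 2 outputs
    )

    out = net(torch.randn(1, 4))   # one sample with 4 features
    print(out.shape)               # torch.Size([1, 2])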

There are several classes of Artificial Neural Network. Those which pass the output of one layer to the next without feedback are called Feedforward Neural Networks (FNNs), while those which contain feedback, for example an Elman network, are called Recurrent Neural Networks (RNNs).
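To illustrate the distinction, a minimal sketch (PyTorch assumed; torch.nn.RNN implements an Elman-style recurrent layer):

    import torch
    from torch import nn

    # Feedforward: the output depends only on the current input.
    ffn = nn.Linear(3, 5)
    y = ffn(torch.randn(1, 3))

    # Recurrent (Elman-style): a hidden state is fed back at every time step,
    # so the output depends on the whole sequence seen so far.
    rnn = nn.RNN(input_size=3, hidden_size=5, batch_first=True)
    seq = torch.randn(1, 10, 3)    # batch of 1, sequence of 10 time steps
    outputs, h_n = rnn(seq)        # outputs: (1, 10, 5); h_n: final hidden state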

One very commonly used term in machine learning is the Deep Neural Network (DNN). These are neural networks with several hidden layers, which enable more complex machine learning tasks to be implemented.

An increasingly popular machine learning approach is the Convolutional Neural Network (CNN). Convolutional Neural Networks are used when the input is two-dimensional, as is the case for embedded vision applications.

To implement a CNN, there exist several different structures which have evolved over time, from AlexNet to GoogLeNet, SSD, and FCN. However, they are formed from the same basic functional blocks, albeit with different parameterisations. These stages are Convolution, Rectified Linear Unit (ReLU), Max Pooling, and Fully Connected. When we train a CNN, we define parameters for the Convolution (filter weights) and Fully Connected (weights and bias) layers, while the Max Pooling and ReLU elements require no weights or biases, although Max Pooling requires parameterisation to define the size of the filter kernel and the stride.
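In code, those four blocks line up directly with framework layers. A hedged sketch assuming PyTorch, with illustrative sizes (a 28x28 single-channel input and 10 output classes):

    import torch
    from torch import nn

    cnn = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3),         # Convolution: trained filter weights
        nn.ReLU(),                              # ReLU: no weights or biases
        nn.MaxPool2d(kernel_size=2, stride=2),  # Max Pooling: only kernel size and stride
        nn.Flatten(),
        nn.Linear(8 * 13 * 13, 10),             # Fully Connected: trained weights and bias
    )

    logits = cnn(torch.randn(1, 1, 28, 28))     # e.g. a 28x28 grayscale image
    print(logits.shape)                         # torch.Size([1, 10])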


How do we implement a machine learning solution?

To develop machine learning applications, there is a range of open source and free software we can use, from Caffe to TensorFlow and Torch/PyTorch. These tools allow us to define, train, and implement our machine learning application without the need to start from scratch.

For edge-based applications, the challenges include power efficiency, security, real-time response, and determinism, along with high performance. Implementing weights and biases as floating-point values results in a solution which is not optimal in performance, determinism, or power dissipation. For this reason, a class of neural networks called Binary Neural Networks (BNNs), which use binary values for weights and biases, is increasingly popular for edge-based deployments.
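The core idea of a BNN can be sketched in a couple of lines (PyTorch assumed). Note this shows only the binarisation itself; practical BNN training schemes such as BinaryConnect keep full-precision shadow weights and use a straight-through estimator for gradients, which is omitted here:

    import torch

    w = torch.randn(4, 4)     # full-precision weights
    w_bin = torch.sign(w)     # binarised weights: entries become +1 or -1
                              # (torch.sign maps 0 to 0; real implementations
                              # typically map it to +1)
    print(w_bin)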

Of course, before we can deploy a neural network, we first need to train it to determine the values of the weights and biases used within each layer. Training adds a stage to the network which implements a loss function; this loss function enables the training algorithm to determine whether the network correctly identified the input.
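For example, a classification network is commonly trained against a cross-entropy loss (a hedged sketch assuming PyTorch; the logits here stand in for the network's output):

    import torch
    from torch import nn

    loss_fn = nn.CrossEntropyLoss()                   # common classification loss

    logits = torch.randn(1, 10, requires_grad=True)   # stand-in network output
    label = torch.tensor([3])                         # the correct class
    loss = loss_fn(logits, label)                     # low when the network is right
    loss.backward()                                   # gradients used to update weights
    print(loss.item())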

When we develop CNNs, we require large image sets to train the network. Two potential sources of such image sets are www.image-net.org and http://bdd-data.berkeley.edu.
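As an illustration of feeding such an image set into training (assuming torchvision, and assuming the ImageNet archives have already been downloaded locally from www.image-net.org):

    import torch
    from torchvision import datasets, transforms

    tfm = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])
    train_set = datasets.ImageNet(root="./imagenet", split="train", transform=tfm)
    loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

    images, labels = next(iter(loader))   # one training batch: (64, 3, 224, 224)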

To work with these large data/image sets and calculate the weights and biases as quickly and efficiently as possible, large farms of Graphics Processing Units (GPUs) are often used.

GPU farms are ideal for training because the goal of training is to generate the weights and biases in the minimum time frame; power efficiency, real-time response, and determinism for each input are not as critical here as they are in the actual deployment.
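From the framework side, using a GPU (or many of them) is largely a matter of placing the model and the data on the device; a minimal sketch, assuming PyTorch:

    import torch
    from torch import nn

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Linear(10, 2).to(device)      # move parameters onto the GPU
    batch = torch.randn(64, 10).to(device)   # data must live on the same device
    out = model(batch)                       # forward pass now runs on the GPU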


What are the differences between Edge and Cloud Deployments?

The technology used to deploy a machine learning application depends upon its use case and requirements.

Edge-based implementations of machine learning are used when connectivity to the cloud cannot always be guaranteed and the application requires a critical response time in its decision loop, for example in an autonomous vehicle. Edge-based applications also bring requirements for any-to-any interfacing, security, and power efficiency, along with the need for high performance.

Designers of edge-based applications are increasingly using heterogeneous Systems-on-Chip such as the Xilinx Zynq-7000 SoC and Zynq UltraScale+ MPSoC, which combine Programmable Logic (PL) fabric with high-performance ARM cores in the Processing System (PS). This combination of PL and PS allows the creation of a machine learning inference engine which demonstrates improved determinism and response time, is very flexible for future modification, and offers a power-efficient and secure solution.

Figure 2: Road sign detection, a Binary Neural Network running on an Ultra96 (Zynq MPSoC)

Cloud-based machine learning deployments are still compute-intensive and can also benefit from acceleration. Cloud-based machine learning can work with large volumes and varieties of data, with correspondingly large storage requirements. To accelerate machine learning applications in the cloud, hyperscale companies such as Amazon Web Services and Baidu have introduced high-performance FPGA devices into their server architectures.

There is no doubt that AI and ML will continue to grow and will be a major component of many systems for the foreseeable future. The proverbial iron is hot. Why not download one of the free software development tools and start working on your own AI/ML application?

Tags: tech spotlight, artificial intelligence, xilinx, ai

Comments
  • crash#1
    crash#1 over 6 years ago in reply to rscasny

    " I believe as we (meaning the overall science, engineering and technology community) becomes more proficiency in AI, the percent of error will become so small that's it's virually insignificant."

     

     

    I hope they are willing to take the risk involved, that small error can still be an outlying anomaly and with nobody at the helm. People will become complacent and go the computer is faster than a human. It must be correct and we don't have a lifetime or budget to confirm it. Thanks, but no thanks.

  • rscasny
    rscasny over 7 years ago

    I am looking for some people who have AI/ML experience for some beta testing/reviews. If you are interested, please message me at rscasny.

    Randall

  • rscasny
    rscasny over 7 years ago

    Fascinating article about space travel and AI in this weekend's Financial Times:

    https://app.ft.com/content/1d9a3be8-b5e3-11e8-bbc3-ccd7de085ffe?sectionid=home

    As much as we would like to think human beings (organic) will travel to planets unknown, radiation in open space would probably kill them before they got there.

    We have to rely on machines to do the space travel, as they have done quite successfully since Neil and Buzz landed on our moon in 1969.

    But we need smarter machines, and AI will do it.

    The above article is a good review of the space program from its beginnings.

    This is just a fascinating area, and I hope I can gather interest on element14 to delve into it more deeply.

    Randall

  • rscasny
    rscasny over 7 years ago in reply to DAB

    Hi Don.

    Interesting comment. You had an issue with the term "artificial intelligence"; I think you preferred "enhanced statistical decision making." I'm not sure that phrase captures, or helps us understand, what is generally meant by the term AI. For me, this sentence by the author helped me understand AI (and machine learning as part of AI) best of all: "Machine Learning is a subset of AI where, in place of explicitly implementing the required behaviour, the ML algorithm is trained with a dataset to create an analytical model which enables prediction within an acceptable tolerance."

    The second thing I found interesting in what you said was regarding the future:

    "I still shiver when I think about the number of people who initiate actions based upon the results of these technologies. Why? Simple: you CANNOT verify either the data or the algorithm that came up with the answer. That means you are taking action on the blind faith that the data and the algorithm are correct. Which is statistically impossible."

    I don't agree with your "blind faith" point, yet I can appreciate your concern over verification and accuracy. I believe as we (meaning the overall science, engineering and technology community) become more proficient in AI, the percentage of error will become so small that it's virtually insignificant.

    I'm wondering if the deeper concern here is not technical but cultural, a matter of technology adoption. The fact is that machines will be taking over more activities that are, or once were, in the purview of human beings. Like anything, it needs time to mature and to be accepted and adopted, so it's a good thing.

    I'd be curious about what other members have to say about your conversation here.

    Randall

  • DAB
    DAB over 7 years ago

    I always object to the term "Artificial Intelligence" as it is highly inaccurate.

    I have followed this technology for nearly forty years and I have seen it applied to a wide range of problems with varying degrees of success, but mostly failure.

    In all cases, you have to be very careful about the types of applications in which you apply these algorithms.

    A more accurate description would be "enhanced statistical decision making" instead of "AI". Yes, it is artificial, but no, it is not intelligence. At best you are just screening a lot of data, trying to find correlations where the data appears to fit your questions.

    Adaptive algorithms, something for which I was part of a patent application team, involve letting the data build specific correlations, usually over time or space. Again, you have to clearly define your data space and your questions before you get any value in return.

    In either case, all that has changed over the last forty years is that the computers have gotten tremendously faster and the algorithms have gotten more efficient in how you frame your questions and sort your data space.

    There is no magic here. The applications still rely on brute-force comparisons of real-time data against training sets.

    I still shiver when I think about the number of people who initiate actions based upon the results of these technologies. Why? Simple: you CANNOT verify either the data or the algorithm that came up with the answer. That means you are taking action on the blind faith that the data and the algorithm are correct. Which is statistically impossible.

    So you can use this technology, but be aware that there is no way to verify the results nor validate that you are making a good decision based upon the output of the technology.

    It is a total coin flip. 50% of the time you will get a good or good enough result, but 50% of the time you will get a bad or really bad result. I prefer a bit more determinism in my decisions.

    DAB
