Page 145 - Computer - 7

It might be tempting to think of ML as doing all the hard work, but an ML algorithm can only follow the precise instructions of its creator. This section highlights the art of the AI engineer, who harnesses the power of concepts from a range of disciplines – most notably computing, logic, statistics, and calculus – while balancing a range of considerations about the problem itself and the context of its solution.

          FACTORS ENCOURAGING ML

The main factors that have made Machine Learning the more relevant approach to AI are:

          Massive Datasets

Machine Learning algorithms tend to require large quantities of training data to produce high-performing AI models. For example, some facial recognition AI systems can now routinely outperform humans, but doing so requires tens of thousands or even millions of labeled images of faces as training data.

When Machine Learning was first developed decades ago, there were very few applications for which sufficiently large training datasets were available to build high-performing systems. Today, an enormous number of computers, digital devices, and sensors are connected to the internet, where they constantly produce and store large volumes of data in the form of text, numbers, images, audio, and other sensor files. Of course, more data only helps if the data is relevant to the desired application. In general, training data needs to match the real-world operational data closely to train a high-performing AI model.
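The importance of matching training data to operational data can be illustrated with a small, self-contained sketch. The data and the deliberately simple nearest-centroid classifier below are purely hypothetical, not drawn from any particular system: the model performs well on operational data from the same distribution it was trained on, and its accuracy degrades when the operational distribution drifts.

```python
import random

random.seed(0)

def make_data(center0, center1, n=200):
    # Two classes of 1-D feature values, drawn around the given centers.
    data = [(random.gauss(center0, 0.5), 0) for _ in range(n)]
    data += [(random.gauss(center1, 0.5), 1) for _ in range(n)]
    return data

def train(data):
    # "Training" here is just computing each class's mean feature value.
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in data:
        sums[y] += x
        counts[y] += 1
    return {y: sums[y] / counts[y] for y in sums}

def accuracy(centroids, data):
    # Predict the class whose centroid is nearest to each feature value.
    correct = sum(
        1 for x, y in data
        if min(centroids, key=lambda c: abs(x - centroids[c])) == y
    )
    return correct / len(data)

# Train on data where class 0 clusters near 0.0 and class 1 near 2.0.
centroids = train(make_data(0.0, 2.0))

# Operational data from the same distribution: accuracy stays high.
acc_matched = accuracy(centroids, make_data(0.0, 2.0))

# Operational data whose distribution has shifted: accuracy degrades.
acc_shifted = accuracy(centroids, make_data(1.0, 3.0))

print("matched distribution:", acc_matched)
print("shifted distribution:", acc_shifted)
```

The same lesson scales up: a facial recognition model trained on studio portraits, for instance, may perform poorly on blurry outdoor security-camera footage, even though both datasets contain faces.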

Increased Computing Power

To a much greater extent than Symbolic AI systems, Machine Learning AI systems require a lot of computing power to process and store large volumes of data. During the early 21st century, computing hardware became powerful and cheap enough that it was possible to run Machine Learning algorithms on massive datasets using commodity hardware.

One especially important turning point, around 2010, was the development of effective methods for running Machine Learning algorithms on Graphics Processing Units (GPUs) rather than on the Central Processing Units (CPUs) that handle most computing workloads. Originally designed for video games and computer


