Page 106 - Code & Click - 8
11 Future of Artificial Intelligence
Pre-Processing
• Big Data
• Data Mining
• Neural Networks
• Deep Learning
• Data Science and Data Scientists
• Future Domains of AI
Artificial Intelligence aims to give machines the ability to mimic human behaviour and display human-like intelligence. It follows several approaches to accomplish this. Let us now learn about some more techniques and approaches that will define the future of Artificial Intelligence and its applications.
BIG DATA
Big data refers to data sets that are too large or complex to be dealt with by traditional data-processing
application software. Big data is a combination of structured, semi-structured, and unstructured data.
Companies use big data in their systems to improve operations, provide better
customer service, create personalised marketing campaigns, and take other
actions that, ultimately, can increase revenue and profits. Businesses that use
it effectively hold a potential competitive advantage over those that don’t
because they’re able to make faster and more informed business decisions.
Examples of Big Data
Big data comes from myriad sources. Some examples of sources of big data are transaction processing
systems, customer databases, documents, emails, medical records, internet clickstream logs, mobile
apps, and social networks.
In addition to data from internal systems, big data environments often incorporate external data
on consumers, financial markets, weather and traffic conditions, geographic information, scientific
research, and more.
Characteristics of Big Data
Big data was originally associated with three key concepts, called the three V’s: Volume, Variety, and
Velocity. Later on, data scientists added more characteristics to big data to ensure that the collected
data was relevant for storage and applications. Let us learn about these V’s.
1. Volume: A big data environment contains a large amount of data due to the nature of the data being collected and stored. Clickstreams, system logs, and stream-processing systems are among the sources that typically produce massive volumes of data on an ongoing basis.
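To get a feel for this volume, here is a rough back-of-the-envelope sketch in Python. The traffic figures are assumptions chosen for illustration, not measurements from any real website:

```python
# Rough estimate of how fast a clickstream grows:
# events per second x bytes per event x seconds in a day.
events_per_second = 50_000   # assumed traffic for a busy website
bytes_per_event = 500        # assumed size of one click record
seconds_per_day = 24 * 60 * 60

bytes_per_day = events_per_second * bytes_per_event * seconds_per_day
gigabytes_per_day = bytes_per_day / 10**9
print(gigabytes_per_day)  # 2160.0 -- over 2 terabytes every single day
```

Even with these modest assumptions, a single source generates terabytes per day, which is why traditional data-processing software cannot keep up.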