Will AI take over the world?

A Ydobon
Oct 24, 2020

- Hey, do you know AI?
- Of course! Such great tech, huh? Everyone is talking about it.
- I know, right? So what is AI, exactly?
- Ehh… something like a robot?

Have you used Siri or Bixby in the last month? Just a few years ago, they didn’t even exist (or were insanely stupid). How about now? They can be our friends, our assistants, or something to play with.

So what are they? They are AI!

As AI has grown dramatically more competent, it has been permeating every corner of human life. But for those of us who never majored in it, there are still great barriers to grasping it. So this post is for THOSE humans. Let’s take a quick ride into the world of AI!

Even before we realized it…

It hasn’t been long since AI rose to take the limelight among ordinary people. The attention-getter was AlphaGo, developed by Google DeepMind. It defeated the European Go champion in 2015 and the world champion, Lee Sedol, in 2016 (Haenlein & Kaplan, 2019, p. 8). Surprisingly, though, AI’s birth traces back to the early 1940s.
(Let’s skip the tedious historical details.)

After many ups and downs, AI has risen brilliantly and now commands a huge presence.

Wait, it has types? How come?

Now it is time to break AI down into three pieces: artificial intelligence, machine learning, and deep learning. Machine learning is a subset of AI, and deep learning is in turn a subset of machine learning.

Artificial Intelligence

Artificial intelligence is defined as “a system’s ability to interpret external data correctly, to learn from such data, and to use those learnings to achieve specific goals and tasks through flexible adaptation” (Haenlein & Kaplan, 2019, p. 5). In a nutshell, AI tries “to develop human-like intelligence in machines” (Das et al., 2015, p. 31). It can also be described as a machine performing tasks in a way that makes the outcome indistinguishable from the same tasks done by humans (Floridi, 2020, p. 129). To achieve this goal, AI should work “as a growing resource of interactive, autonomous and often self-learning” systems (p. 129). This idea leads us to the concept of machine learning.

Machine Learning

To accomplish what is suggested above, machine learning grew out of AI, enabling “the machines to gain human-like intelligence without explicit programming” (Das et al., 2015, p. 31). This field is directly tied to the AI people are used to seeing in their daily lives, such as photo tagging, web search, and email anti-spam (p. 31). A machine can learn via supervised learning, unsupervised learning, reinforcement learning, and so on. The main difference between supervised and unsupervised learning is the existence of an expected output. Supervised learning compares the computed output with the expected output and adjusts the model to shrink the error between them. Unsupervised learning, on the other hand, has no expected output, so it gives weight to patterns in the input instead. Meanwhile, reinforcement learning focuses on the outcomes of the learning agent’s actions in order to “maximize some notion of long-term reward” (p. 32). With these increasingly advanced methods, machine learning has been adopted in “many segments of industry” (p. 31), from our daily lives to highly advanced technological businesses (pp. 32–38).
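To make that supervised loop concrete, here is a minimal sketch in plain Python with NumPy (not from any of the cited papers, just an illustration): a model guesses an output, compares it with the expected output, and nudges its parameters to shrink the error.

```python
import numpy as np

# Toy supervised learning: recover y = 2x + 1 from labeled examples.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_expected = 2.0 * x + 1.0           # the "expected output" (labels)

w, b = 0.0, 0.0                      # model parameters, start from scratch
learning_rate = 0.05

for step in range(2000):
    y_computed = w * x + b           # the "computed output"
    error = y_computed - y_expected  # compare the two outputs
    # Adjust the parameters to shrink the mean squared error (gradient descent)
    w -= learning_rate * (2 * error * x).mean()
    b -= learning_rate * (2 * error).mean()

print(f"learned w={w:.3f}, b={b:.3f}")  # converges toward 2.0 and 1.0
```

Run it and the learned parameters settle near the true values of 2 and 1, which is exactly the “adjust the error” story told above.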

Deep Learning

Deep learning, a subset of machine learning, uses neural networks inspired by the neurons in human brains. Deep neural networks improve their performance by stacking multiple hidden layers between the input and output layers (Scharre et al., 2018, p. 5). The learning processes of deep learning can also be roughly classified into supervised, unsupervised, and reinforcement learning. What matters is that those hidden layers are learned from data, without being directly designed by human engineers (LeCun et al., 2015, p. 436). Deep learning is regarded as a powerful method, competent at solving many different problems (Scharre et al., 2018, p. 9). Hence, as deep learning advances, AI can be adopted to solve real-life problems.
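For the curious, here is a toy sketch (again plain Python with NumPy, purely illustrative) of a network with one hidden layer learning the XOR function. Notice that nobody designs the hidden layer’s weights by hand; they are adjusted automatically from the data, just as LeCun et al. describe. Depending on the random seed it may need a few more training steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny network: 2 inputs -> 4 hidden units -> 1 output, learning XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights (learned, not hand-designed)
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass through the hidden layer
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error and adjust every layer's weights
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # approaches [0, 1, 1, 0]
```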

So in the future…

Before some of us even notice, AI will apparently dominate human life. Deep learning will achieve more successes because “it requires very little engineering by hand” and can therefore take advantage of vast amounts of data (LeCun et al., 2015, p. 436). You’d better prepare not to be replaced by AI, unless you want to live on unemployment benefits.

Can you imagine a day without your smartphone? Sounds awful. But less than two decades ago, smartphones were just phones with a few extra functions, and some of those functions were even regarded as unnecessary. Now I guess you can see AI’s future.

AI is already widely used in many ways, “from image recognition to predicting medical outcomes” (Scharre et al., 2018, p. 6). Just like the smartphones that are indispensable today, AI will be a mainstay technology of the future, seeping into almost every corner of human life.

So are we gonna lose to that sneaky AI? Well, maybe, or maybe not.
All I can say is,

We are human, YES WE ARE!
Raise a toast to our yet-unknown future!

References

- Das, S., Dey, A., Pal, A., & Roy, N. (2015). Applications of artificial intelligence in machine learning: Review and prospect. International Journal of Computer Applications, 115(9), 31–41. doi: 10.5120/20182-2402
- Floridi, L. (2020). What the near future of artificial intelligence could be. In C. Burr & S. Milano (Eds.), The 2019 Yearbook of the Digital Ethics Lab. Springer, Cham.
- Haenlein, M., & Kaplan, A. (2019). A brief history of artificial intelligence: On the past, present, and future of artificial intelligence. California Management Review, 61(4), 5–14. doi: 10.1177/0008125619864925
- LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. doi: 10.1038/nature14539
- Scharre, P., Horowitz, M., & Work, R. (2018). Artificial intelligence: What every policymaker needs to know (Report). Center for a New American Security. doi: 10.2307/resrep20447.4
