Artificial Intelligence

What is Artificial Intelligence?

Artificial Intelligence, or AI, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. The term "Artificial Intelligence" is often used to describe machines (or computers) that mimic "cognitive" functions that humans associate with the human mind, such as "learning" and "problem solving".


History of Artificial Intelligence

The idea of inanimate objects coming to life as intelligent beings has been around for a long time. The ancient Greeks had myths about robots, and Chinese and Egyptian engineers built automatons.

The beginnings of modern AI can be traced to classical philosophers' attempts to describe human thinking as a symbolic system. But the field of AI wasn't formally founded until 1956, at a conference at Dartmouth College in Hanover, New Hampshire, where the term "Artificial Intelligence" was coined.

But achieving an artificially intelligent being wasn't so simple. After several reports criticized progress in AI, government funding and interest in the field dropped off, a period from 1974 to 1980 that became known as the "AI winter". The field revived in the 1980s when the British government started funding it again, in part to compete with efforts by the Japanese.

The field experienced another major winter from 1987 to 1993, coinciding with the collapse of the market for specialized AI hardware and reduced government funding.

But research began to pick up again after that, and in 1997, IBM's Deep Blue became the first computer to defeat a reigning world chess champion when it beat Russian grandmaster Garry Kasparov. And in 2011, the computer giant's question-answering system Watson won the quiz show "Jeopardy!" by beating reigning champions Brad Rutter and Ken Jennings.

Artificial Intelligence in the Present

Artificial Intelligence is used for many things today. From e-mail spam filters to self-driving cars, there are many examples of AI around us. In 2015, Google's fully self-driving car took blind passenger Steve Mahan on public roads in Austin, Texas, USA.
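
To give a feel for the "learning" mentioned earlier, here is a minimal sketch, in Python, of how a very simple spam filter might learn from examples. The training messages and the word-voting rule are made up for illustration; real spam filters use far more sophisticated statistical models.

# A toy spam filter: it "learns" which words appear more often in spam
# than in normal mail, then uses those counts to score new messages.
from collections import Counter

# Hypothetical training data: (message, is_spam) pairs.
training = [
    ("win a free prize now", True),
    ("claim your free money", True),
    ("meeting agenda for monday", False),
    ("lunch with the team tomorrow", False),
]

spam_words, ham_words = Counter(), Counter()
for message, is_spam in training:
    (spam_words if is_spam else ham_words).update(message.split())

def looks_like_spam(message):
    # Each word votes: +1 if it was seen more often in spam,
    # -1 if seen more often in normal ("ham") mail, 0 on a tie.
    score = 0
    for word in message.split():
        if spam_words[word] > ham_words[word]:
            score += 1
        elif spam_words[word] < ham_words[word]:
            score -= 1
    return score > 0

print(looks_like_spam("free prize money"))       # True
print(looks_like_spam("team meeting tomorrow"))  # False

The point is not the code itself but the pattern: instead of a programmer hand-writing rules, the program derives its behaviour from examples, which is the essence of machine learning.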



You can even find AI right in your own smartphone. Nearly every modern smartphone comes with an AI assistant, such as Google Assistant on Android or Siri on iOS.

In 2018, Moley Robotics created the very first robot cook. It is said to cook as well as a five-star chef, and it learns by recording 3D visuals of actual cooks in action and then repeating their movements.


Future of AI

The future of AI is hotly debated. Some say it will make our lives easier and more efficient, while others say it will turn against us and take over humanity. There are serious arguments on both sides: AI can learn in ways that resemble human thinking, and on some narrow tasks it already outperforms humans.

It is already automating manual and repetitive tasks, and soon it will augment human decisions. Along the way, by some estimates it will add more to global GDP by 2030 than the current output of China and India combined. That growth should be more than enough to create many good jobs, though it will also change how existing jobs are done.

Artificial intelligence is shaping the future of virtually every industry and every human being. It has acted as the main driver of emerging technologies like big data, robotics and the IoT, and it will continue to act as a technological innovator for the foreseeable future.

In the post-Covid-19 era, a technology that had until now been crawling, or at best walking slowly, will start sprinting. In fact, a paradigm shift in mankind's economic relationships is going to be witnessed in the form of accelerated adoption of artificial intelligence in the production of goods and services. The Fourth Industrial Revolution, as the AI era is often called, was already being experienced before the pandemic through the enabling infrastructure of cloud computing, big data and 5G. However, the imperative of continued social distancing has made an AI-driven economic world order today's reality.

- Inuka Batawala






