
Movie

The first Star Trek series debuted in 1966 on NBC and ran for three seasons.

Actor Leonard Nimoy recalled that when he obeyed a director's advice to "be detached" when speaking the line "Fascinating", "a big chunk of the character was born right there". He liked Spock's logical nature, observing that the character is "struggling to maintain a Vulcan attitude [...] opposing what was fighting him internally, which was human emotion".


Recording

Kraftwerk's 1978 album The Man-Machine, with its Russian Constructivist, robotic cover, inspired our soundtrack, especially the track "The Robots" with its singing robot.

This is the album for those who love the allure of electronic dance music on a dance floor crammed with fun-loving warm bodies, bathed in shimmering strobes and pulsating colored spotlights. Overall, the album has captured enough attention and consciousness to stand the test of time.

Books

Andrew McAfee: The Industrial Revolution was when humans overcame the limitations of our muscle power. We’re now in the early stages of doing the same thing to our mental capacity.

Erik Brynjolfsson: We call it the great paradox of the Second Machine Age. Even though we have record productivity, US median incomes are lower now than they were in the 1990s. There’s no law that everybody’s going to benefit from technology. 

You may also like:

Introduction to Machine Learning

A new SpareTag.com tutorial explaining how to make an Artificial Intelligence, from data warehousing to deep learning to attention and consciousness.

3 simple steps to make an Artificial Intelligence

Making of what is Artificial Intelligence


All visuals were selected, edited, animated and colored by SpareTag.com to form the following sequences of our original 90-second short video:


  1. Welcome to Deep Learning Tutorial: A celebration of the internet community, for sharing knowledge and skills through millions of home-made videos. Cheers! You are life savers to all of us.
  2. Data. Lots of data: Visuals inspired by the Matrix movie:
    "-What do you need beside a miracle?
    -Guns. Lots of guns".  Hopefully, data will be more peaceful!
    By the way, the data warehousing cabinets belong to the FBI...
  3. Filtering TV shows through a Neural Network example: The idea of feeding artificial life into fictional characters, like Mr Spock, comes from "Tomorrow is Waiting," a must-read story by Holli Mintzer.
  4. Paying Attention: We looked at a lot of research to create this post. We can't acknowledge all the contributors, but we do wish to credit Michael Graziano for his Attention and Consciousness theory.
  5. AI graffiti: Our best Artificial Intelligence super hero (or villain, depending on your perspective) comes from street art made in Valladolid, Spain. It is a great representation of power (or threat), with backing from our friends (or foes) in Silicon Valley
    -- we added the dollar bill cape for fun (or provocation).

The word "computer" was first use in 1613 by Richard Braithwait, an English writer, to describe a person, a highly skilled mathematician, able to perform impressive calculations. This is not without irony that, in the brief history of computer, fast computing machines may become intelligent virtual persons able to operate independently with their own attention and consciousness.


A brief history of computer


The first concept of a computer machine emerged in the 19th century.  It was the Analytical Engine, conceived – but never built – by English engineer Charles Babbage (1791–1871). The design had (i) an arithmetical logic unit (the "mill", now called ALU) to perform all four arithmetic operations and comparison, (ii) a control unit to interpret instructions (from “punched cards”, now called programs) allowing conditional branching and loops, and (iii) the ability to memorize 1,000 numbers of 40 digits each using physical wheels (the “store,” now called RAM). 

Still, it took another century before Alan Turing laid out, in his 1936 paper On Computable Numbers, the same key principles for machines to perform detailed computations from memorized sets of instructions. The need for machines to help decipher encoded messages during WWII was instrumental to the development of the technology that resulted, in 1946, in the first general-purpose digital computer: the Electronic Numerical Integrator and Computer (ENIAC). It is said that the lights dimmed in Philadelphia when it was turned on for the first time.

From there, the technology improved exponentially, each generation building faster and faster on the previous achievements: the first commercial use in 1951 (Universal Automatic Computer, or UNIVAC 1); the replacement of vacuum tubes by transistors in 1955; the invention of integrated circuits in the late 50s; Intel’s first single-chip microprocessor in 1971. Then came IBM’s personal computer (PC) for home and office use in the 80s, running Microsoft’s new MS-DOS operating system, subsequently replaced by Windows in the 90s.

There doesn’t seem to be an end to the technological progress of our brief history of computer, leading unavoidably to the best Artificial Intelligence. It is just a question of time: is AI centuries or decades away? Like hardware, applications of artificial intelligence will evolve through successive waves of innovation. But what is artificial intelligence? Here is our introduction to machine learning.


Data Warehousing to gain Knowledge


Today, computers have become ubiquitous in all areas of life, at work and at home. This is because computers are useful tools, much better than most humans at information-processing jobs (which occupy about 65 percent of the American workforce).

In addition to surpassing the human mental capacity for calculation and processing tasks, computers have made possible the global interconnection of people and information sharing through the internet.  Combined with data warehousing capability, this has brought universal knowledge to our connected doors.

Since the development of powerful computers and cheap, fast-access memory in the late 1970s and early 1980s, machines have been instrumental in collecting, cleaning up, classifying and warehousing data to gain knowledge. The logical next step is the introduction to machine learning, with computers able to draw lessons from the stored data.
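
To make this concrete, here is a minimal sketch of such a collect / clean / classify step in Python with pandas; the "customers.csv" file and its columns are hypothetical, invented only for illustration:

```python
import pandas as pd

# Collect: load raw records (the "customers.csv" file and its columns are hypothetical).
raw = pd.read_csv("customers.csv")

# Clean: drop incomplete rows and normalize a text field.
clean = raw.dropna(subset=["age", "country", "purchases"]).copy()
clean["country"] = clean["country"].str.strip().str.upper()

# Classify: group customers into simple segments to gain knowledge from the stored data.
clean["segment"] = pd.cut(clean["purchases"],
                          bins=[0, 1, 10, float("inf")],
                          labels=["new", "regular", "frequent"])
print(clean.groupby("segment", observed=True).size())
```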


Predictive Deep Learning Tutorial


What is artificial intelligence today? Computers are currently able to recognize behaviors and patterns in databases in order to predict outcomes. For example, many tech companies are developing solutions to assess which customers are most likely to buy specific products. This information is then used to choose the best marketing and distribution channels. Without the computers’ analytics, decisions are made on expert judgment, which is biased toward the status quo, i.e. toward the outcome most favorable to the expert’s interests, an outcome usually worse than relying purely on data.
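
As an illustration, here is a hedged sketch of such a customer-purchase prediction using scikit-learn's logistic regression; every number below is made up for the example:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy data: each row is [age, past_purchases, days_since_last_visit];
# the label says whether the customer bought the product (1) or not (0).
X = np.array([[25, 1, 30], [40, 5, 3], [33, 0, 90], [52, 8, 1],
              [29, 2, 14], [61, 3, 45], [36, 7, 2], [47, 0, 120]])
y = np.array([0, 1, 0, 1, 1, 0, 1, 0])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Score new prospects: the probability of buying guides the choice of marketing channel.
prospects = np.array([[30, 4, 5], [55, 0, 200]])
print(model.predict_proba(prospects)[:, 1])
```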

Following the example of the human brain’s neural network, computers can already apprehend facts by themselves. In practice, data are filtered through successive layers of processing and grouping that, at the end, extract the essential characteristics of the original cloud of points. All concepts are learned from scratch, by analyzing and classifying the relations in the unstructured data, without supervision or specific programs. This is done with a new generation of graphics processing units (GPUs), derived from video games and particularly suited to recognizing patterns in large volumes of heterogeneous data. Interestingly, computers achieve a 98% success rate in image recognition, whereas humans are wrong in about 5% of cases.
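
To picture those "successive layers of processing and grouping", here is a toy feed-forward sketch in plain NumPy; the weights are random rather than trained, so it only shows the layered structure, not a working deep-learning system:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))          # 100 raw data points, 16 features each

def layer(inputs, out_dim):
    """One processing layer: a linear mix of features, then a non-linear filter."""
    w = rng.normal(size=(inputs.shape[1], out_dim))   # random, untrained weights
    return np.maximum(0.0, inputs @ w)                # ReLU keeps only activated patterns

h1 = layer(X, 8)      # first grouping of features
h2 = layer(h1, 4)     # more abstract characteristics
codes = layer(h2, 2)  # essential two-number summary of each original point
print(codes.shape)    # (100, 2)
```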

Soon, deep learning will allow a machine to understand a language, getting not only the general sense but also the context, irony, metaphors, jokes, even intonation and silence. At first, the sentences will not be comprehensible, but after many comparisons and reclassifications of words and groups of words, a true meaning will emerge.
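
Those comparisons and reclassifications of words ultimately rest on measuring how close word representations are to one another. Here is a tiny cosine-similarity sketch with made-up word vectors (real systems learn such vectors from large text corpora):

```python
import numpy as np

# Toy word vectors; the values are invented for illustration only.
vectors = {
    "robot":   np.array([0.9, 0.1, 0.3]),
    "machine": np.array([0.8, 0.2, 0.4]),
    "banana":  np.array([0.1, 0.9, 0.2]),
}

def similarity(a, b):
    """Cosine similarity: 1.0 means same direction (related meaning), 0.0 unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(similarity(vectors["robot"], vectors["machine"]))  # high: related meanings
print(similarity(vectors["robot"], vectors["banana"]))   # low: unrelated meanings
```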

For the moment, machines can only identify patterns but lack the tools to put things in perspective, prioritize and recommend a course of action.  To be able to make decisions, a computer would need to be aware of the context, to have attention and consciousness.


Learning Attention and Consciousness


Somehow, the brain also processes information, but in a more subjective manner, through memories, senses and emotions, which provide context and awareness. To gain this level of attention and consciousness, a machine would need to have at its disposal:

- surrounding “object" blueprints, i.e. generic vector representations of objects or fact patterns relevant to the situation.  Through the deep learning technology (see above), today’s computers are able to build some of these representation by themselves.

- its own computer “body" blueprint, i.e. its virtual self-representation, including the characteristics of its physical shape, personality traits (behavior) and past performance (memories). This should be done carefully to prevent the devious future behaviors, like killing all humans, that many fear.

- an “attention scheme,” i.e. a description of the complex relationship between the “body” and the “objects”, some kind of practical map providing information about where the machine is, where it wants to go and the various paths to get there. The attention scheme allows the machine to concentrate its resources and focus on objects and tasks.

Without an attention scheme, the machine is not able to explain how it relates to anything in the world. With one, the machine is able to affirm its awareness of the object by reference to the defined scheme … the same way a human has attention and consciousness, without being able to explain where it comes from.
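
One speculative way to picture these three ingredients together is as a small data structure: object blueprints, a body blueprint and an attention scheme relating the two. Everything in the sketch below (class names, fields, behavior) is our own illustration, not an actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectBlueprint:            # generic representation of a surrounding object
    name: str
    features: list[float]

@dataclass
class BodyBlueprint:              # the machine's representation of itself
    shape: str
    traits: list[str]
    memories: list[str] = field(default_factory=list)

@dataclass
class AttentionScheme:            # map relating the "body" to the "objects"
    body: BodyBlueprint
    focus: ObjectBlueprint | None = None

    def attend(self, obj: ObjectBlueprint) -> str:
        self.focus = obj                      # concentrate resources on one object
        self.body.memories.append(obj.name)   # past performance becomes memory
        return f"I am aware of {obj.name}"    # awareness affirmed via the scheme

scheme = AttentionScheme(BodyBlueprint("virtual agent", ["curious"]))
print(scheme.attend(ObjectBlueprint("red cube", [1.0, 0.0, 0.0])))
```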


"By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it." – Eliezer Yudkowsky


"But if the technological Singularity can happen, it will" Vernor Vinge
