
Thursday, January 13, 2011

Happy 40th Birthday to Intel's 4004, the Chip That Made the World Digital

Thanks for the PCs, Pads, Smartphones, Facebook, Google, Cyber Monday, MRIs and AR. Really

Happy Birthday, dear 4004.

When it was born, sometime in 1971, no one, not even his parents (all Intel engineers), could have guessed he would come this far. Now, in 2011, we celebrate Mr. 4004's festive 40th birthday. In 1971, the 4004 was the first commercially produced and sold central processing unit (CPU): the first general-purpose microprocessor that let engineers program its behavior to their needs. Mr. 4004 was a wonder of its time; a tiny chip, the size of a fingernail, that contained no fewer than 2,300 transistors, and was faster and more powerful than the world's first electronic computer, which was built in 1946 and filled an entire room.

Forty winters have passed, and although the story is well known, it remains hard to grasp. With the tailwind of Moore's Law (which predicts that the number of transistors on a chip will double every two years), the 4004's descendants grew exponentially, and today Intel's top-of-the-line processors contain about 800 million transistors.
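The doubling claim can be sanity-checked with a few lines of arithmetic. This is only a rough sketch: the inputs are the 4004's 2,300 transistors, its 1971 birth year, and a strict two-year doubling period, nothing more.

```python
# Rough Moore's Law projection: transistor counts doubling every two years,
# starting from the Intel 4004's 2,300 transistors in 1971.
START_YEAR = 1971
START_TRANSISTORS = 2300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year):
    """Project a transistor count for `year` under a naive two-year doubling."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

# Forty years means twenty doublings: 2,300 * 2^20, landing in the billions --
# the same order of magnitude as actual flagship CPUs of 2011.
print(f"{projected_transistors(2011):,.0f}")
```

Twenty doublings carry 2,300 transistors past two billion, which is why a four-decade-old law still tracks reality to within a small factor.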

The even more astonishing figure is the total number of chips and embedded devices in the world. This figure also celebrates a milestone in 2011: for the first time it will cross a hundred billion. With about 7 billion humans in the world, that makes roughly 14 chips for every man and woman on the planet -- a decisive majority for the machines. The growth that led from the founding father, the 4004, to a hundred billion chips was not linear either. In fact, most of the descendants of the 4004 joined us during the last decade, suggesting that this growth is still at the "knee" of the curve, in the early stages of exponential takeoff.
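The per-capita figure is simple division on the two round numbers the paragraph assumes (100 billion chips, 7 billion people):

```python
# Back-of-envelope: chips per person in 2011, using the article's estimates.
total_chips = 100e9   # ~100 billion chips and embedded devices
total_people = 7e9    # ~7 billion humans

chips_per_person = total_chips / total_people
print(round(chips_per_person))  # rounds to 14 chips per person
```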

They, the chips, are all around us: in almost two billion personal computers, like the one on which I write now. Dozens of them hide quietly in every modern car; they lurk in washing machines and toasters, nest in your children's toys, and wait patiently inside our vacuum cleaners. Elsewhere they work very long hours: inside the hundreds of thousands of Google servers spread around the world, and in supercomputers chewing through thousands of teraflops deep in the basements of universities and governments. Very soon they will appear inside our furniture and clothing, in our food, and eventually inside our bodies.
Major aspects of modern life are managed by computerized systems, with ever-declining human involvement and control.

Air traffic; the internet and the web, with the inconceivable amount of information they hold; national energy infrastructures; civilian and military command and decision systems; medical platforms; and the global financial infrastructure -- all these and countless others depend almost entirely on combinations of smart hardware and software to keep running, and less and less on the humans who originally designed them. Not sure? A few months ago the SEC installed a "kill switch" aimed at limiting the ability of autonomous trading algorithms to buy and sell stocks and bonds.

I can't help but remember old HAL 9000, and predict that we will see many such stories in the coming years.

Beyond the boring old questions on who controls whom, the carbon or the silicon life form, it is fascinating to try and look at the cultural changes resulting from the acceleration of technology. Almost paradoxically, with the growth of computing power in our lives, technology is disappearing from sight. Since computing power is now part of almost everything we do, we take it for granted, and pay attention to it only when it fails -- similar to what happened with electrical power in the previous millennium. Almost overnight, computing has transformed from a scarcity-driven product to a state of abundance.

We take for granted the ever stronger PCs, the pads and the smart-phones, Facebook, Google, Black Friday e-commerce deals, internet on planes, MRI, machine translation, digital books and movies, augmented reality, printing and viewing in three dimensions, electric cars that park themselves, and even robots circling around our legs hunting for dust. In many ways, we now live in a fundamentally different human culture -- new even for the people who watched 4004 being born just 40 years ago. A brave new world indeed.

As marketing and communications professionals, we tend to believe we're catching up with the technology, and sometimes even getting ahead of it. After all, we've all launched a Facebook campaign and a couple of Flash sites, and we even got an iPhone. I believe the disruption that accelerating technology is creating for our industry is only in its early stages.

As 4004 and its offspring become ubiquitous and disappear from our line of sight, the importance of writing great lines will again be paramount for marketing and communications. They will not be copy or poetry lines, but lines of code. Our future lies in writing great software that delivers brands as media and services to consumers, and we still have a unique advantage in understanding what they (and we) will want.

As brands become media, our clients will become publishers, and will shift to annuities -- currently called "owned media".

Ask yourself, can your agency ideate and build this kind of product for them, or will the job go to Google, or to a couple of kids in a garage near you?

The era of machines began only 40 years ago, and in many ways, with the acceleration and the convergence of technology around us, this year it completes its infancy, and begins to wonder about what to do when it grows up. Forty years from now, in 2051, the world will most probably be a much stranger place, even for us, geeky residents of 2011. Happy Birthday and New Year to 4004, and to all other life forms.

http://adage.com/digitalnext/post?article_id=148173
