Tuesday, 15 September 2009

Crop Circle Day on Google

It's Crop Circle day on Google, 15 September 2009. Why not? They're pretty and mysterious. I run a crop circle list myself, so does Google like me? Where is my list entry, Crop Circles @ Feed Distiller.com? Secret. Shout it from the rooftops, baby. Guess what: most of the top pages are virals. Crystal Links was typical: links on crop circles and a spooky infomercial blipvert, beamed at me to demean me. You pay for the link by PayPal, and the love and attention that went into the graphics was ugly. I'm no master of graphic design either, and graphic designers have turned me down even when I was offering cash. Lazy gits. Crystal Links could get a link from my system for the price of reading it. If that has a name, it's probably "open viral". So...

I'm also interested in the mechanics of the ecology of viral themes. Do you like being called viral? It's just descriptive here. Lazy people want free money from Google, and will pay a lot to get it. What traps do those playing P.T. Barnum fall into here? The laws of the arena are not clear to me. If Google and Microsoft Bing want to, they could wipe me out in a second. Where's the L in the logo? Do Microsoft or Google want the job of managing the ecology? Probably not; it's too much thinking. Plus all the politicians and pressure groups will want to interfere in it. God help us all if Widow Twankey starts making the laws. Google and Microsoft make the algorithms that manage the ecology. Gaming the ecology matters. Money grows on trees, sometimes decision trees. But society has to work fairly. I've lost the lottery; now what, sofa surfing? Are the skills going in the tea shops and pubs you'll retreat to? Porcelain china, and do a lot of work for charity. Graphic designers are safe, especially if they look like models. Another cold hard day for the Morlocks.

How do you measure the productivity of a web site, for its money, and for how well it does its reason for existence? As judged by the reviews, I guess; is looking deeper a mind trap? But if you don't get the attention of the reviewer, that doesn't happen. Google chose crop circles today, which was lucky for Crystal Links. So it began. And what was the top page on Google? The Daily Telegraph. News matters.

Wednesday, 9 September 2009

The Chip of the Future, AD 2020

Since I'm blogging here about the future of computing, I thought I'd have a look today at what the next ten years will bring to the CPU, the processor chip at the heart of all modern computers. Futurology is often a difficult subject, but when it comes to silicon the future has been pretty well mapped out by maths. Back in the 1960s, when the silicon chip was first invented, Gordon Moore of Intel predicted the following.

  • The number of transistors on a chip will double every two years.
  • Or, equivalently, the width of a transistor will shrink by a factor of 1.414 every two years.
  • (From the speed of light), the clock frequency will increase by a factor of 1.414 every two years.

Taking these at face value, and starting with 2010's model CPU (8 cores, each running at 4GHz, with 12MB of cache), we can extrapolate ten years: five die shrinks, each 1.414 times smaller. Let's make that six, to make the maths easier, so we're really looking at the chip of 2022. Here's what we get:

  • 512 cores on a chip, each running at 32GHz
  • 512MB of cache
  • 35 TeraFlops, or 35 million million floating point maths operations per second
  • RAM modules will be 128GB and run at about 10GHz
  • Flash memory will be around 1TB in module size
  • Hard disks? Might well not keep pace; might only have tens of TBs
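The extrapolation above can be run as a few lines of Python. This is only a sketch of the arithmetic; the 2-flops-per-cycle figure is my assumption, not anything from Moore:

```python
# Extrapolate Moore's Law from a 2010 baseline CPU over six die shrinks.
# Each shrink: transistor count doubles, clock frequency grows by sqrt(2).

SHRINK = 2 ** 0.5          # frequency factor per two-year step (~1.414)
STEPS = 6                  # six shrinks, roughly 2010 -> 2022

cores = 8                  # 2010 baseline: 8 cores
clock_ghz = 4.0            # at 4 GHz
cache_mb = 12.0            # with 12 MB cache

for _ in range(STEPS):
    cores *= 2             # doubling transistors doubles the core count
    cache_mb *= 2          # ...and the cache size
    clock_ghz *= SHRINK    # clock scales with the linear shrink factor

# Assume 2 floating point operations per cycle per core (my assumption).
tflops = cores * clock_ghz * 2 / 1000

print(cores, round(clock_ghz), round(cache_mb), round(tflops))
```

Strict doubling actually gives 768MB of cache and about 33 TFlops; the 512MB and 35 TFlops above are the same round-number ballpark.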

Are we really going to get a chip like that? In particular, does the average user need a 500-way multicore chip? The multicore trend only started in 2004, because chip designers found it was the most efficient way to increase the power of their processors. If it is to continue, software needs to change in order to use many processors at the same time. This will be a big change in programming style: it will require new operating systems that can intelligently schedule big tasks across many processors, while keeping small regular tasks easily started on other spare processors. I'm sure it can be done, but will it? If not, manufacturers will find it difficult to sell such massively parallel many-core chips.
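The sort of scheduling described above can be sketched with Python's standard library. This is a toy, not a real OS scheduler: one big task is split into chunks and handed to a pool of workers, one per core, while the operating system remains free to run other tasks on whatever cores fall idle. The function names here are mine:

```python
from concurrent.futures import ProcessPoolExecutor
import os

def crunch(chunk):
    """One fragment of a 'big task': sum the squares of a chunk of numbers."""
    return sum(x * x for x in chunk)

def main():
    data = list(range(10_000))
    workers = os.cpu_count() or 1
    # Split the big task into one chunk per available core...
    size = len(data) // workers + 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...and let the process pool schedule the chunks onto spare processors.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(crunch, chunks))
    print(total)

if __name__ == "__main__":
    main()
```

The more cores the chip has, the more chunks run at once; the hard part, as the paragraph says, is that the program had to be written to split its work up in the first place.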

The ATOM version

If multi-cores don't catch on, then it's more likely manufacturers will opt to use the space on the silicon chip to build complete computers on a chip, with I/O, graphics and memory all integrated into a single chip, continuing the trend started with AMD's Fusion chips (graphics processor and CPU on the same chip). For gamers, a big GPU and a few CPU cores, perhaps with a dedicated physics unit, might be the preferred option.

This might be the final chip too

Our chip of 2022 is made at 4nm lithography (6 shrinks from the 32nm Intel and AMD will be using in 2010), and this is about the limit of how small we can make chips. In a silicon crystal, atoms of silicon are spaced about 0.5nm apart, so our 4nm chip has wires just 8 atoms thick. Quantum tunnelling between adjacent wires becomes a problem already at 16nm (scheduled for 2013-2016). Once we can't shrink processors any further, we can only make more powerful computers by increasing the size of the chip itself or by building vertically, but either way we would have to pay more for the building. Once we stop shrinking chips, the price of each transistor stops going down. So it really looks as if the end point of conventional silicon chips will be reached in the 2020s.
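The feature-size arithmetic is simple enough to check, taking the 0.5nm atomic spacing quoted above:

```python
# Each die shrink divides the feature size by sqrt(2);
# six shrinks from the 32nm node gives the 2022 chip.
feature_nm = 32.0
for _ in range(6):
    feature_nm /= 2 ** 0.5

atom_spacing_nm = 0.5                      # silicon atoms ~0.5nm apart
atoms_wide = feature_nm / atom_spacing_nm  # wire width measured in atoms

print(round(feature_nm), round(atoms_wide))  # prints: 4 8
```

Four nanometres, eight atoms: there is simply not much atom left to shrink.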

The revolution is over, and still no AI

Our computer of 2022 is still a bit short of a human brain, which is estimated to need about 10^16 computations per second, and 10TB of memory, to emulate. We're about 300 times too slow to match a human brain, with 100 times too little memory. To get there we'd need another 6 die shrinks, but one-atom-thick wires just aren't possible. Having said that, a 2020s supercomputer made with hundreds of these processors gets there, so big organisations can start playing god with brain-rate supercomputers, if they're so funded. In fact, though, for modelling the brain we've very much got the wrong architecture above. The brain is made of 100 billion slow but parallel-operating neurons connected by 100 trillion interconnections. To emulate it properly we want chips which are vastly more parallel, integrated with a rewireable transport system able to connect any segment to any other in real time. Such a chip might only be made once AI or mind-uploading is already standard.
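The gap to the brain can be put in numbers. This is a back-of-envelope check using the 10^16 ops/s and 10TB estimates above against the 2022 chip's 35 TFlops and one 128GB RAM module; the per-shrink performance factor of 2 x sqrt(2) (double the transistors, 1.414 times the clock) is my assumption:

```python
import math

brain_ops = 1e16          # estimated computations/second to emulate a brain
brain_mem_tb = 10.0       # estimated memory needed, in TB

chip_flops = 35e12        # the extrapolated 2022 chip: 35 TFlops
ram_tb = 128 / 1024       # one 128GB RAM module, expressed in TB

speed_gap = brain_ops / chip_flops   # roughly 300x too slow
memory_gap = brain_mem_tb / ram_tb   # roughly 100x too little memory

# Each further shrink doubles transistors and multiplies the clock by 1.414,
# so per-shrink performance grows by about 2 * sqrt(2). How many more shrinks?
shrinks_needed = math.ceil(math.log(speed_gap) / math.log(2 * 2 ** 0.5))

print(round(speed_gap), round(memory_gap), shrinks_needed)
```

The answer comes out at 6 more shrinks, matching the count in the paragraph, and those are exactly the shrinks the atoms won't allow.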

Conclusions: The singularity is delayed

Once we reach the end of Moore's Law, I'm sure innovation will continue in computing, certainly in software, but hardware will have stalled, improving only gradually from then on. It might be a long wait before the next revolution, nanotech, starts. Nanotech rewrites the rules on how to build computers (and everything else), but to start it will require 3-dimensional, atom-by-atom construction of self-replicating robots. That's not an easy task to carry out, nor is it easy to design; nanotech might come anywhere next century, or never. Some futurologists would like to start nanotech in the 2020s just to keep Moore's Law working, but Moore's Law is just a line drawn through existing trends. Hard work makes it continue to be true, but hard work can't get past the limits of the laws of physics. The computer of much of the current century may have specs like I've described above, and it will take a new revolutionary technology to better it.