In 1965 a forward-thinking computer engineer named Gordon Moore made the observation that became known as Moore's Law. Despite Moore being one of the founders of Intel, not many people outside the computer industry have heard of him or his law. So what is Moore's Law, and why is it still relevant 50 years on?
Moore made his famous prediction in 1965, three years before he went on to co-found Intel, while working tirelessly with the earliest generations of integrated circuits. He understood how important microchips would be in shaping the modern world, and he spotted a clear relationship between processing power and price: every two years, he postulated, the number of transistors built into a microchip would double, while the cost of chip-based technology would halve.
Moore didn't set out to create a law; he simply made a statement about the emerging trends he saw in the semiconductor industry. The reason it is still revered today is that Moore was right. In fact, he was more than right: since 1965 the number of transistors on microchips has doubled roughly every 18 months. He was pretty much bang on about cost too, with the average computer costing around $1 million in the 1960s, compared with today's price of around $630.
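Moore's prediction is just compound doubling, which takes only a few lines of Python to sketch. The 1965 baseline of 64 transistors here is a hypothetical round number chosen purely for illustration, not a figure from Moore's paper:

```python
# Illustrative sketch of Moore's Law as compound doubling.
# Assumptions (not from the article): a 1965 baseline of 64 transistors
# per chip and the 18-month doubling period cited above.
BASELINE_YEAR = 1965
BASELINE_TRANSISTORS = 64
DOUBLING_PERIOD_YEARS = 1.5  # 18 months

def projected_transistors(year):
    """Transistors per chip the doubling rule predicts for a given year."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1965, 1975, 1985, 1995):
    print(year, f"{projected_transistors(year):,.0f}")
```

Projected over many decades the 18-month figure overshoots real chips, which is one reason the doubling period is often quoted as two years instead.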
The Moore Effect
Now, there are a lot of theories and so-called 'laws' out there that have stood the test of time without really affecting wider society. Moore's Law, however, was different. It gave emerging tech companies a formula for quantifying output and profit. Using his theory, companies knew what productivity levels they needed to hit their desired profit margins, and what cost per unit would give them a competitive edge in the market. This reduced risk made producing and selling chip-based technology far more efficient and predictable.
But this wasn't the only important effect Moore's Law had on the industry. As many companies based their production and distribution schedules around Moore's prediction of 18-24 months, consumers started buying new products at roughly the same rate. Consumers, either consciously or subconsciously, came to expect a new product every two years. Have you ever seen a friend with a newer, more impressive iPhone than yours and felt jealous or judged? Well, that feeling can be attributed to Moore's Law. This not only allowed companies to predict when consumers would buy their products; it also gave them the power to influence the buying patterns of their markets.
The golden days are over
For 50 years the world enjoyed ever more powerful, faster, smarter and cheaper chip-based technology. Microchips gave birth to brand new industries and products, such as gaming, mobile phones and the internet, and have helped save millions of lives by driving vital advances in medicine. But those days could soon be over. The incessant drive to create smaller transistors and more powerful microchips appears to have drained the pools of innovation. Companies have hit a brick wall and are now struggling to create smaller transistors.
There is a reason for this. The full scientific answer involves too many equations and odd-looking symbols, so we'll give you the short version. Some of the world's most advanced microchips today pack around 4 billion transistors into an area of roughly 87 mm². This appears to be close to the smallest a transistor can go without disrupting electrons. Made any smaller, and electrons begin to leak across the transistor's barriers, rendering the microchip useless.
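The figures above imply a staggering density, which a quick back-of-the-envelope calculation makes concrete (this simply works through the 4 billion / 87 mm² numbers already quoted):

```python
# Back-of-the-envelope density from the figures above:
# ~4 billion transistors packed into roughly 87 mm² of silicon.
transistors = 4_000_000_000
area_mm2 = 87

# How many transistors share each square millimetre.
density_per_mm2 = transistors / area_mm2

# Average silicon area per transistor, in square nanometres.
# 1 mm = 1_000_000 nm, so 1 mm² = 1e12 nm².
area_per_transistor_nm2 = area_mm2 * 1e12 / transistors

print(f"{density_per_mm2:,.0f} transistors per mm²")
print(f"~{area_per_transistor_nm2:,.0f} nm² per transistor")
```

That works out to tens of millions of transistors per square millimetre, with each one occupying a patch of silicon only a couple of hundred nanometres on a side.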
So, where do we go from here?
Well, many companies have now dropped out of the microchip race. Instead of fighting an unwinnable war, they have begun to look elsewhere and use innovation to take advantage of existing resources. For example, graphics processors, traditionally used to render video game graphics at lightning speed, are being repurposed to drive advances in data analysis and artificial intelligence.
Other companies are looking at how to get more out of their current stock of microchips without shrinking transistors at all. Google, for example, has increased its storage capacity and begun linking its existing microchips together, boosting the power of its circuits and increasing its overall processing power.
A world without microchips
People are already predicting a future in which we no longer use microchips, and the technology is already here. Ironically, the solution could lie in the problem: quantum computing is quickly becoming the prized alternative to the microchip. Traditional silicon-based microchips process information as a series of 0s and 1s, called bits. Quantum computers use qubits, which can exist in a combination of 0 and 1 at the same time, enabling them to process more information faster while using less energy. Google, IBM and Intel have all built their own quantum computers; it's only a matter of time before they put them into mass production.
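The gap between bits and qubits is easy to see in code. A classical n-bit register holds exactly one of its 2^n possible values, while describing n qubits takes 2^n complex amplitudes all at once, which is why simulating even modest quantum machines on ordinary hardware quickly becomes impractical. A minimal sketch (the function names are our own, purely illustrative):

```python
# Classical bits vs qubits, as a memory-cost comparison.

def classical_register(n, value=0):
    """A classical n-bit register: one definite value out of 2**n."""
    assert 0 <= value < 2 ** n
    return value  # a single integer, however large n gets

def qubit_register(n):
    """An n-qubit state vector: 2**n complex amplitudes at once.
    Here it starts in the all-zeros state |00...0>."""
    state = [0j] * (2 ** n)
    state[0] = 1 + 0j
    return state

print(len(qubit_register(3)))   # 8 amplitudes for just 3 qubits
print(len(qubit_register(20)))  # 1,048,576 amplitudes for 20 qubits
```

Each extra qubit doubles the size of the state vector, so the classical cost of simulation grows exponentially; that exponential workspace is exactly what quantum hardware exploits natively.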
Other solutions are constantly emerging, some promising better results than the traditional microchip. Graphene, for example, still relies on transistors, but it is a far more conductive material than silicon; early experiments suggest graphene could process information up to 1,000 times faster than traditional microchips. Koniku, a technology company based in San Francisco, is using real, living neurons to process data, and hopes one day to build living neurons into chips.
With death there is always life
We are likely to witness the death of Moore's Law in our lifetime. The microchip has become an exhausted resource: where innovation once thrived, a stagnant industry now lies. But that isn't a bad thing; without a problem, you cannot find a solution. The microchip's time may be passing, but innovation still thrives. As Moore suggested more than 50 years ago, we are now on a new horizon towards a better, smarter, faster future.