
Standing on the Shoulders of Giants

Edited May 29, 2016

Standing on some lofty shoulders

Isaac Newton famously wrote in 1676, "If I have seen further it is by standing on the shoulders of giants."  It isn't surprising that Newton understood something so important on a fundamental level - after all, it was Sir Isaac himself who first understood that gravity acts as a force attracting every object in the universe to every other, along with many other universal truths and major concepts in physics.  However, Newton also understood that without the help of predecessors like Euclid, there's simply no way he could have invented the calculus needed to calculate the orbits of the planets, or even arrived at his famous "inverse square" law - the idea that gravity loses its potency in proportion to the square of the distance from an object.  This is a matter of geometry: gravity's influence spreads out over the surface of an ever-larger sphere, and the Greek geometers (Archimedes, in this case) had already worked out how to calculate the surface area of that sphere nearly two thousand years earlier (lucky for Newton).
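For the curious, the geometry fits in one line (a quick sketch; here L stands for a body's total gravitational "output" and r for the distance from it - labels of my own choosing, not Newton's):

    \[
      A_{\text{sphere}} = 4\pi r^{2}
      \qquad\Longrightarrow\qquad
      \text{intensity at distance } r = \frac{L}{4\pi r^{2}} \propto \frac{1}{r^{2}}
    \]

Double the distance and the same influence spreads over four times the area, leaving it only a quarter as strong.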

In much the same way, Albert Einstein could not have made his earth-shattering discovery of special relativity if it hadn't been for Lorentz, the Michelson-Morley experiment, and a host of other then-recent advances (including James Clerk Maxwell's superb description of the electromagnetic field), and his incredible "general relativity" theory owed a debt of gratitude to Sir Isaac himself, whom Einstein ultimately toppled.  In fact, every single discovery in physics owes the same homage to the "giants" who came before, paving the way with foundational theories and concepts.  The same holds true for technological innovation, and understanding this can help us see where we fit in as a species, and where we might be headed in the future.


Tech acceleration and Moore's Law

The same concepts that underlie discoveries in physics (and, more broadly, in science in general) absolutely apply to technological innovation.  Here's where things get really interesting when we consider how the future is going to look.

Computer chips are now designed by computers, not by hand, because the designs are far more intricate than any human can hold in mind. Faster, better computers are then built with these new chips, which are in turn capable of producing still better chip designs, and so on (more on that in a bit).  You're probably familiar with Moore's Law, first articulated by Gordon Moore - who would go on to co-found Intel - all the way back in 1965:

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer."

At the time, the above prediction was anything but a "law" - Moore had scant data to go on - but as the years wore on it proved accurate not only in the short term; here we are, half a century later, still steamrolling forward.  Half "law" and half self-fulfilling prophecy (the computer industry uses Moore's Law to project production, expenses, and earnings), the observation continues to hold.  As more and more data accumulates, there are no chinks in Moore's armor - yet.  There will be, though, around 2020, as transistors shrink toward scales where quantum mechanical effects become unavoidable and the wavelengths of light used to etch chips can shrink no further.
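Just to make the arithmetic behind that 1965 projection concrete, here is a minimal sketch in Python (the starting figure of roughly 64 components in 1965, and the 10-billion-transistor chip in the second example, are illustrative assumptions of mine, not numbers taken from Moore's paper):

    def projected_components(start_count, start_year, target_year, doubling_period_years=1.0):
        """Project a component count, assuming one doubling every doubling_period_years."""
        doublings = (target_year - start_year) / doubling_period_years
        return start_count * 2 ** doublings

    # Doubling every year, as in Moore's original 1965 statement:
    print(projected_components(64, 1965, 1975))          # 65536.0 -- roughly the "65,000" Moore cited

    # The commonly quoted modern form, a doubling every two years:
    print(projected_components(10e9, 2016, 2026, 2.0))   # ~3.2e11, i.e. hundreds of billions

Stretch the doubling period to two years in that first projection and the 1975 figure drops to about 2,000 components instead of 65,000 - which is why the doubling period in the headline matters so much.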

Why it works

One of the main reasons Moore's Law has continued unabated for half a century is that the chips themselves - the very products of the innovation - are facilitating the next generation of designs.  When I began my college career in 1993, I majored in engineering (a choice I soon regretted).  One of the major perks of the university I attended was access to AutoCAD, a then relatively new computer-aided design (CAD) program that made drafting plans far easier.  Programs like this are now used to design incredibly complicated computer chips.  As the number of transistors on a chip approaches and then exceeds 10 billion, and eventually 100 billion, there is simply no way a human mind can plan and place the individual transistors anywhere near as well as software built specifically to do so.  So the chips created in the late 80s enabled new design programs in the early 90s, which were used to design the chips of the late 90s, which in turn powered the CAD programs of the early 2000s, and so on.  Today's CAD programs are immensely more capable than the ones I first used in the 1990s, and tomorrow's will make today's look like Pong looks to an avid gamer.

Innovations across fields of science

One of the most exciting things happening with innovation in general, and with scientific discovery in particular, is that advances in one field of science are playing major roles in discoveries in others.  One of my favorite potential examples is DNA computing.  It may well be the next paradigm in chip design once Moore's Law runs its currently relentless course and finally collides, around 2020, with the immovable laws of quantum mechanics and the uncertainty principle (and the stubbornly finite wavelength of the electron). DNA itself could only recently be sequenced thanks to innovations in computing drawing on quantum mechanics, artificial intelligence, and a host of other fields, and our understanding of it has taken a dramatic leap in the past decade. DNA computing, in turn, may well lead to far better AI, much faster computing, and much, much more.

Computer science has led to realistic simulations of the early universe, helping to corroborate predictions not only from cosmology but also from quantum mechanics.  Particle accelerators rely on supercomputers to process the torrent of data produced when protons and electrons collide, and the fundamental concepts that emerge feed back into computer design, and so on, ad infinitum.  Better simulations of the universe's origins and better collision analysis will come from greater computing power, which itself comes from a deeper understanding of the laws of nature - and around we go again.


Portable artificial lighting - one specific example

All of this culminates in gadgets we use every single day, often without a second thought.  Consider the flashlight app on your smartphone.  It almost certainly drives an LED, which gives off intense light without draining your battery almost immediately.  LEDs, or light-emitting diodes, arose from an understanding of quantum mechanical principles, including "electron holes" that electrons fall into when the proper voltage is applied, releasing photons very efficiently.  All well and good, but to really understand and appreciate this fairly simple application, we have to go back - all the way back, in fact, to our very first technological innovation: controlling fire.

Consider what life must have been like for human beings before we controlled fire, somewhere in the neighborhood of 500,000 years ago.  Cooking food (making it considerably more digestible and less dangerous) was one tremendous benefit, but let's focus on the "controlled light" aspect and how important that was.  When night fell, people could barely see, since our eyes are poorly adapted to darkness, while many of our natural predators - and much of our prey - could see in the dark just fine.  With fire, our way of life changed irrevocably: people could now find their way around at any hour.

Of course, fire in caves ultimately made possible the fascinating cave paintings of 50,000 years ago or so, and since art must have been evolving long before that, people had clearly been taking full advantage of the first artificial light for ages.  For tens of thousands of years, the torch ruled the "artificial light" sphere - little more than a tree branch set alight at the tip - but eventually this concept led to a new paradigm: the oil lamp.  It was dramatically superior to the torch: the light was continuous and far steadier, and it lasted much, much longer.  It brought together at least three converging technologies:

  1. Controlling fire itself, as described above
  2. Containing the flame in something that would not itself catch fire (a hollowed-out stone or a shell, for instance)
  3. The discovery that oil burned

Oil lamps owned the arena of artificial light for 10,000 years, possibly longer.  Then, around 2,000 years ago, a stunning new technology emerged from China: the candle.  Candles lasted a long time, produced a uniform flame, and were pretty safe, too, but overall they were really pretty crappy until the 1700s, when wax from the head of the sperm whale (I can't make this stuff up, folks) was found to make dramatically better candles - slower burning and far better at holding their shape.  That was a mini-game-changer as well, and folks could now work late into the night.

Finally, around 1900, what we might think of as a "modern" flashlight was invented, the dry cell battery having arrived about a decade earlier and made truly portable electric devices possible.  The flashlight had a few obvious edges over, say, the candle: it was very unlikely to set your house on fire, you could turn it on with the flip of a switch, and it gave off no odor or smoke.  Clearly, without the dry cell battery, the flashlights we all grew up with wouldn't have existed.  The dry cell was itself just an improvement on earlier batteries, though, so it was "standing on the shoulders" of Ben Franklin, Luigi Galvani, and Alessandro Volta, who all played various roles in inventing the battery (although Volta generally gets the credit).

And of course none of it works without the metal casing that holds the batteries, which owes a debt of gratitude to the first humans who started playing with iron, maybe 6,000 years ago, and eventually learned how to smelt it, separating it from the other elements.  And so it goes.  Your smartphone's flashlight app is probably switched on by a touchscreen button, which in turn owes a debt of gratitude to all of the technologies underlying that innovation, going all the way back to Xerox PARC in the late 70s (and further back still).

Studying the past

This is precisely why I find the history of discoveries and inventions so fascinating, something I've focused on for the last decade or so.  Thanks to the Internet and the World Wide Web (thank you Tim Berners-Lee, Vint Cerf, DARPA, and everyone else who led up to them!), I can do a lifetime's worth of research whenever I like - look something up on Google, spend a day learning about it, even watch a documentary on whichever subject has caught my interest.  The history of the Internet itself is something everyone should be at least a little interested in, given how (relatively) new it is and how dramatically it has changed our lives.  Go back and trace how we got from there to here and you're bound to see a lot of standing on the shoulders of giants, whether you're looking at technology and its accelerating innovations, discoveries in physics, or just about any other field you can think of.  We're all connected today as never before, and that connectivity is only increasing - dramatically - with time.  Instead of climbing up onto the giants, today we bounce up on a trampoline, and then someone else springboards past us, ever upward.

Today - right now - is the very best time to be alive, ever.  We are at an amazing juncture in human history, where the world is not only connected as never before, but innovation is fostered and encouraged.  If something is discovered today in India, you'll know about it 30 seconds later in Indiana.  It is something I still have to stop and think about in wonder, and never, ever take for granted. 


