The Inevitable Singularity
Transhumanism: Crossing the Human Threshold
Blurred Lines of Biological and Mechanical Systems
First-Year Composition Class #29120
Life as we know it, at least from our own perspective and experiences, is already an amazing miracle. Across the many branches of the tree of life, there can be drastic similarities or differences between varying forms of life. Time tends to change things; evolution and adaptation are key to surviving, or even thriving, in a given environment. These are common occurrences throughout nature and our physical reality. Now, set the stage for a biological life form to come along, one with natural curiosity but also with cunning, awareness, consciousness, intellect, insight, innovation, and determination; the list goes on. Over time, this life form begins to create machines, computers, and robots, and like all exponential functions, its progress and store of information continue to double at an ever-increasing pace. We have become passengers on this train ride, barreling down the track with our speed increasing; soon it will be too fast to turn, slow down, or stop before the tracks end. This inevitable loss of control, this potentially catastrophic point of no return, is aptly named "the Singularity," or often "the Technological Singularity." It is the hypothetical point in the future when a runaway cycle of self-improvement by an intelligent, technologically "aware" agent will have an uncontrolled and irreversible impact on human existence, with unforeseeable consequences. This scenario is one of the more popular and well-known variations, especially because of how terrifyingly, surreally close it feels to where we are now and where we are headed. How can we implement protocols, safeguards, and limitations in the abstract and emotional departments of human experience? Morality, ethics, and metaphysical ideas, after all, can lead to some worrying future outcomes.
The "awareness" factor, in conjunction with exponential superiority, is a troublesome thought as well.
Many corners of human civilization and pop culture, Hollywood films, television series, books, and novels, describe events like these. For decades, people have told stories about such events taking place in the future, in one way or another, showing the fall of humanity to its own creation: Artificial Intelligence. Before that, myths and legends were tales of adventure, action, suspense, and drama. They taught lessons about what good behavior, lifestyles, and priorities should look like, a training tool of sorts in ethics.
As a species and a society, regardless of one's own thoughts on the matter, we have multiple discoveries and evidence-based conclusions revealing previous human civilizations and the cultures those societies lived by. Whatever one makes of the competing hypotheses on exact dating scales, it is widely accepted that humans have been on Earth for at least thousands of years, based on the great civilizations of the past, ones that dominated the known world in their time. Different empires have ruled and conquered regions of the world: the Sumerians, Akkadians, Egyptians, Ottomans, Persians, the Asian dynasties, the Mayans, Aztecs, Incas, Romans, French, English, and Americans, to name a few. Some of these civilizations spanned hundreds or even thousands of years, and it took them centuries to discover and invent the ideas, tools, and methods needed to overcome obstacles in construction, language, mathematics, engineering, and more. The last century alone, however, has shown the largest growth in human history, with most of it coming from just the past few decades of research and discovery.
There are mixed definitions and reactions, but how long did the ancient Neanderthals roam the earth? Thousands or tens of thousands of years, maybe longer, according to radiometric dating of preserved prehistoric remains, animal and human alike. Only after all this prehistoric time did the ancient civilizations just mentioned come to fruition. Even these city-states have been dated at two to five thousand years old, and the eras that followed, the Middle Ages, the Renaissance, and the Age of Discovery, still lasted centuries. Finally, as we approach our own time, the modern periods of history begin to present themselves. In these periods, the power and control of the Catholic Church began to wane as industry, science, technology, and innovation slowly eroded the general populace's unquestioning trust; discoveries increasingly explained to people, with evidence and proof, how and why things are the way they are.
As the author Arthur C. Clarke once stated, "Any sufficiently advanced technology is indistinguishable from magic." If a person from the past were shown a phone, vehicle, aircraft, radio, or TV from our current society, they might recoil in fear or speak of wicked magic and dark demonic forces; for someone living in today's world, the same object is nothing remarkable, just a known and understood innovation. That fear grows from the unknown and the misunderstood. As these topics were researched and explored for better understanding, the effort came to be known as science: the study of natural laws, the states of matter, and their connections, the building blocks of reality. So where did this path begin, and why is it such a hot topic today?
The idea of a mechanically constructed calculating machine first came to light over 200 years ago, around 1812, with the English inventor and mathematician Charles Babbage. After much debate, deliberation, and questing for funding, the Difference Engine was designed and constructed, later improved upon in the design of the Analytical Engine, the forerunner of the modern digital computer, where logic, operational functions, and memory storage were introduced and expanded upon. The first computers were extremely large and bulky, with many inherent faults, since these machines were experiments and always works in progress. Humans adapt and overcome the obstacles that present themselves on the path of scientific discovery; why should human inventions be any less adaptable and evolutionary, when nature itself performs these kinds of feats as well? Later came Alan Turing, the famous English mathematician, cryptanalyst, and computing pioneer who helped the British government crack the codes of the German Enigma machine, helping the Allies defeat the Axis powers during World War II.
These first machines and computers were large and cumbersome, and they often needed custom-made parts, some designed and built by these individuals themselves, since government and private funding were really the only ways to get such ideas off the ground. Over a century after the first Difference Engine, supercomputers remained extremely large for a span of more than fifty years, from the 1940s through the 1990s, taking up entire rooms and even entire buildings. They were designed and constructed by the largest companies, corporations, and government agencies, especially for "national security" purposes, among other reasons. Even these great machines were prone to breakdowns, and they were so customized that it took entire teams, agencies, and companies to work through the problems and develop better, faster, smaller computers. Many popular terms have sprouted from the field of computing and robotics. The term "computer bug," for instance, was first recorded in September of 1947 and referred to a real-life insect, a moth; the moth was removed from the machine's internals, and the first "debugging" was performed.
The Industrial Revolution began to reshape the social and economic world in the mid-1700s with ingenious inventions, steam engines, power looms, and manufacturing mechanisms created to ease the strain of repetitive human labor, and these innovations helped pave the path to the first computers. That transition eventually set off the boom of the Information Age, a term synonymous with the Computer or Digital Age. Before computers, it was estimated (in 1945) that the information contained in a library took roughly fifteen years to double. Twenty years later, in 1965, came Moore's Law: the observation that the number of transistors on a microchip, and therefore a computer's processing power, would roughly double every eighteen months. That is an exponential growth function, not unlike the spread of virulent, pathogenic biological agents such as the virus behind the current COVID-19 pandemic. As with this rather grim example, and with most if not all sciences, the more data is involved, the more algorithms and inferences can be made. Such computations require machines to condense, process, and extrapolate meaningful information from all the excess data, data that may be erroneous or meaningless, at least to humans.
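The gap between the two doubling rates above, library knowledge every fifteen years versus transistor counts every eighteen months, can be made concrete with a quick back-of-the-envelope sketch. This is purely illustrative: the thirty-year window and the starting baseline of 1x are arbitrary choices, not figures from any source.

```python
# Compare two exponential growth rates: knowledge in a library doubling
# every ~15 years versus transistor counts doubling every ~18 months
# (Moore's Law). Starting values and the time window are arbitrary.

def doublings(years, period_years):
    """How many times a quantity doubles over a span of years."""
    return years / period_years

def growth_factor(years, period_years):
    """Total multiplication after repeated doubling: 2^(years / period)."""
    return 2 ** doublings(years, period_years)

span = 30  # an arbitrary 30-year window
library = growth_factor(span, 15)   # doubles every 15 years
chips = growth_factor(span, 1.5)    # doubles every 18 months (1.5 years)

print(f"Library capacity over {span} years: {library:.0f}x")
print(f"Transistor count over {span} years: {chips:,.0f}x")
```

Over the same thirty years, the library grows fourfold while the chip grows more than a million-fold (2^20); this is what makes exponential doubling so dramatic.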
In the foreseeable future, if the world doesn't decide to end first, Moore's Law will eventually close the chapter on this era of computational processing power. The computers, machines, and robots of today will look archaic next to the new generation of machines that computers are already helping to build. Transistors on a microprocessor can only be made so small. Some of the first integrated circuits, more commonly known as microchips, were roughly the size of an adult's finger and held only a handful of transistors, resistors, and capacitors. Today, thanks to the ever-shrinking miniaturization of electronics and to new discoveries about matter and how to control it, such as the electrons a computer uses to process information, a microchip the size of a penny or a dime can contain hundreds of millions of transistors. That number is only going to continue to climb, toward billions and trillions of transistors in the largest of today's conventional supercomputers. And as current research makes clear, there are scales of matter many orders of magnitude smaller than even the atom.
As we fall deeper into the subatomic and quantum-mechanical realm of science, the conventional computer will eventually fade, like all inventions over time, and quantum computers will become more widespread, following the same path as past innovations like the radio and television. Personal computers became available for home use, not just for businesses, and were eventually followed by the laptop, the tablet, and the smartphone; in fact, the device in your pocket right now is many magnitudes more powerful than even the world's best supercomputers of the 1970s through the 2000s. In a sense, the humans walking around today are the first pre-cybernetic organisms, precursors of the cyborgs Hollywood likes to depict in its movies. When information was needed, people once turned to the library and the academic community. The academic community still matters; there are still specialists and professionals who have studied their craft for years or even decades, making it their life's work. But now people turn to search engines and databases, where algorithms, logic, probability, and statistics are applied nearly instantaneously, at least by human standards, to surface connections within what is now commonly called "big data." Pacemakers, cochlear implants, and artificial hips and knees are some of the innovations humans have devised to fix problems as they present themselves. With the transhumanist movement on the rise, however, more people and companies than ever are pursuing research and development in human-enhancement technologies. The main goal of transhumanism might be described as easing the natural state of pain and suffering; still, the adage "the road to hell is paved with good intentions" comes to mind when you consider the reasons behind any ideal. These enhancements and modifications may very well improve human health and longevity, and even drastically increase our capacities of sense, perception, and cognition. But these fields involve so many individuals and organizations working on so many problems that they inevitably lean toward a convoluted dissemination of information. Even where there is oversight and monitoring, it is often misunderstood, or simply unknown, what could happen; that is why such experiments and simulations are usually run in controlled environments, in case anything goes horribly wrong. It is only a matter of time, however, before a mistake is made and a designed system outsmarts its maker in every way, breaking free of the constraints and limitations placed on it by its engineers and designers.
Sometimes, though, even these limitations and constraints aren't enough. Then there are issues like biohacking, a more common, everyday version of the problem: essentially do-it-yourself biology, and dangerous, pursued at your own pace, where you alone determine the lines you are or are not willing to cross and your own level of risk versus reward. It can be as innocent as taking vitamins and supplements, but it can also be as intrusive as an implant for a specific purpose, such as an RFID chip, a controversial issue in its own right, which some consider the proverbial "Mark of the Beast." Religious and spiritual aspects aside, when fields of study are taken into individual hands, oversight is thrown out the window and a person's own logic (or madness) drives them toward their own goals. Scientists in every field are working on these problems as well. Subsets of AI, especially machine learning and neural networks, modeled loosely on real-life neural or brain design, can execute these functions exponentially fast. The concept is still hard for humans to grasp, yet computers are performing these calculations on unfathomable amounts of data and drawing correlations and inferences from them.
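The neural-network idea mentioned above can be sketched at its simplest: a single artificial neuron (a perceptron) that learns the logical AND function by repeatedly nudging its connection weights, loosely mimicking how a brain strengthens or weakens synapses. All names and parameter values here are illustrative, not drawn from any particular library or from the sources this essay cites.

```python
# A minimal sketch of one artificial neuron (a perceptron) learning the
# logical AND function by nudging its weights toward the right answers.
# All names and parameter values are illustrative.

def step(x):
    """Threshold activation: the neuron 'fires' (1) when its input is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Classic perceptron rule: move each weight in proportion to the error."""
    w0, w1, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = step(w0 * x0 + w1 * x1 + bias)
            err = target - out            # -1, 0, or +1
            w0 += lr * err * x0
            w1 += lr * err * x1
            bias += lr * err
    return w0, w1, bias

# Truth table for AND: output is 1 only when both inputs are 1.
and_gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(and_gate)

for (x0, x1), target in and_gate:
    print(f"{x0} AND {x1} -> {step(w0 * x0 + w1 * x1 + b)} (expected {target})")
```

A real network stacks millions of such neurons in layers and tunes them with calculus rather than this simple rule, but the core loop, guess, measure the error, adjust, is the same.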
The possibilities are as endless as the numbers involved. Even nations are becoming ever more dependent on electronics and on "connectedness," the fast-growing Internet of Things (IoT) that links the devices humans use. The superpowers of today constantly strive for safety and peace, with prosperity and freedom, but movies have long imagined a darker side to what these breakthroughs will bring. Films like The Terminator, I, Robot, and The Matrix seem so far-fetched that they are hard to take seriously, yet bleak outcomes are very much a possibility. The US Department of Defense is building a war-fighting cloud to interconnect its defensive and offensive capabilities, a general-purpose enterprise cloud that, in the tradition of government agencies and really awesome backronyms, is quaintly labeled JEDI, the Joint Enterprise Defense Infrastructure. Even if oversight committees ensure that certain protocols and safety measures are put in place, how would those reports reach the public's attention, if at all? For national-security reasons, the public isn't even aware of many projects, past, present, or future. Big corporations have stumbled here too: Facebook once shut down an AI experiment after two chatbots, negotiating with each other, drifted into a shorthand language unintelligible to humans. And products like Neuralink, started by Elon Musk (of Tesla, SpaceX, and PayPal), plan to begin testing on humans after trials in which a monkey was reportedly able to control a computer with its brain were deemed "successful."
These Neuralink chips carry thousands of microscopic fibril threads (a quarter the width of a human hair) whose probe ends are inserted deep within the brain, as opposed to current practice, where wires are implanted near the brain's surface for monitoring and stimulation in patients who are partially paralyzed or who suffer from seizures and other brain injuries. More unsettling still, these chips are wireless once implanted and interface with an app, just as a game on your phone would, arriving alongside the rollout of 5G and the concerns some have raised about its effects on living things. Movies like Gamer and Upgrade come to mind, where control of a human is handed to a computer or, more frightening, to another human being. It is akin to business and finance: no one wants to risk their own money, but someone else's is fair game; substitute one's own life for the money, and the dark side of this idea appears very quickly and easily. Are there aspects that are positive and beneficial? Absolutely, without a doubt. The flip side is that there will always be corrupt people and dark ideas, more than happy to take from others for their own benefit.
In conclusion, many of these fields can benefit society and life, liberty, and the pursuit of happiness. There are, however, individuals who try to rule through brute force and power projection, against nations and against people as a race. Global issues are on the rise: the planet's ice sheets are melting, the permafrost is releasing ancient gases, pollution and toxicity are killing off entire ecosystems, and we continue to harvest fossil fuels and geothermal energy even as we look to the stars and reach for the cosmos. The proverb "don't keep all your eggs in one basket" comes to mind: if something were to cause an extinction-level event, it would behoove us, as the only sentient life form discovered thus far, to spread out and keep our own race going. The prospect of creating an artificial intelligence shows promise, and its current limitations will only begin to solve themselves as time moves forward. Will it be for the benefit and livelihood of the human race? Or will it be the creation that spurs our downfall and extinction? Only time will tell.
Works Cited

Bergstein, Brian. "The Great AI Paradox." MIT Technology Review.
Chiu, Eric. "Facebook AI Project Generates Own Language, Chat Transcript Baffles Humans." International Business Times.
Cloud Computing Program Office. "Enterprise Cloud." US Department of Defense.
Editors of Biohackability. "What Is Biohacking?" Biohackability.
Editors of Encyclopaedia Britannica. "Charles Babbage: British Inventor and Mathematician." Encyclopaedia Britannica.
Griffin, Andrew. "Facebook's Artificial Intelligence Robots Shut Down After They Start Talking to Each Other in Their Own Language." The Independent.
Hays, Sean A. "Transhumanism: Social and Philosophical Movement." Encyclopaedia Britannica.
Hemmendinger, David, William Morton Pottenger, Michael R. Swaine, and Paul A. Freiberger. "Computer System." Encyclopaedia Britannica.
"When Was the First Computer Invented?" Computer Hope.
Kania, Elsa B. "Great Powers Must Talk to Each Other About AI." DefenseOne.
"The History of the Integrated Circuit (Microchip)." ThoughtCo.
Pearce, David. "The Biointelligence Explosion: How Recursively Self-Improving Organic Robots Will Modify Their Own Source Code and Bootstrap Our Way to Full-Spectrum Superintelligence." Hedweb.
Selyukh, Alina. "After Moore's Law: Predicting the Future Beyond Silicon Chips." NPR.
Shankland, Stephen. "Elon Musk Says Neuralink Plans 2020 Human Test of Brain-Computer Interface." CNET.
Veldhuijzen van Zanten, Boris. "The Very First Recorded Computer Bug." The Next Web.