The Cyberplane
Of all the areas of technology developed by humankind, none has evolved as quickly as digital media and cybernetics; so quickly, in fact, that to study their history properly one must divide it into three periods: the First Stage, the Immersion Stage and the Merging Stage.
The First Stage comprises the first decades after the Internet's invention, when the network became increasingly important to society as a whole and increasingly easy to access, be it through PCs, smartphones, notebooks, netglasses or even holographic wristwatches. The widely agreed-upon defining feature of this era is that, in those years, the digital world was like a world “on the other side of a mirror”: you could see into it, but never “enter” it – in other words, one could not “feel” through it. Another feature of that age was that accessing the Internet was not always easy, or even possible, as people relied on a signal that was not always available to them.
And then, in the 2030s, came the invention that heralded the Immersion Stage: a device that allowed a user to receive information through all five senses. If watching a documentary about a rainforest, for instance, the user could not only see images and hear sounds, but also feel and smell the environment.
More than that: people could now virtually touch each other – and anything else, for that matter – through the Internet. This device, dubbed “ohkan” (Japanese for “crown”) due to its appearance, worked by connecting directly to the brain, transmitting sensory information straight to the brain areas responsible for each of the five senses.
This quickly paved the way for a further expansion of the technology: it allowed people to access the Internet with their minds, no longer forced to rely on mice and clicks, or even on PCs and smartphones. However, the ohkan demanded that users remain physically inactive while using it, and so other electronic devices were still widely used.
A direct consequence of this new technology was the establishment of the Cyberplane: virtual environments (servers) where people could dwell with “digital bodies” (or avatars), meeting other people, visiting virtual versions of material-world places (that was when cybertourism was born), playing games, accessing virtual libraries...
Still in the first years of the Immersion Stage, stable and strong Internet signals became as readily available, common and necessary as electricity itself; in most cities of the developed world, anyone, on any corner, could access the digital world.
One huge social consequence of the Immersion Stage's advent was how it changed the way people interacted. During the First Stage, even though people often talked to each other through text or voice messages, they still needed to meet physically in order to achieve a greater level of intimacy.
In the Immersion Stage, however, people could interact with each other in the Cyberplane just as fully as they would in the material world. Many still resorted to physical meetings for safer (that is, more private) interaction, as Cyberplane environments were vulnerable to hacking and “hackpeeking”; but the truth is that people became generally more individualistic from then on.
The Immersion Stage lasted roughly from 2030 to 2130. The ohkans evolved into smaller, more efficient and more capable devices that greatly improved the Cyberplane experience – to such an extent, in fact, that dwelling in it felt as realistic as dwelling in the material world (or, as it came to be called first by teenagers and then by everyone else, the “matworld”).
But that was far from the only invention of that age. The Internet of Things evolved too, to the point that nearly every item used in developed and developing countries had electronic components and was connected to other items and to remote controls; “intelligent houses”, run by domestic AIs and controllable from afar, became increasingly commonplace. Everything and everyone in the world was, quite literally, interconnected.
Furthermore, the Immersion Stage's innovations had a deep impact on education. Now that essentially everyone had access to everything known to humankind, teachers lost their ancient position as transmitters of knowledge. That did not mean they lost their jobs; on the contrary, teachers became responsible for helping students select and process the massive amount of information available to them, guiding their pupils as they built their own knowledge.
In those years, educators became known no longer as teachers, but as guides and masters. And the students, who now actively built their own understanding of the world with their masters' aid, grew to be much more critical and aware of global and individual issues than their predecessors.
However, this monumental access to information had a negative consequence as well. Around the second half of the 21st century, psychiatrists raised awareness of what became known as the “illness of the century”: sensorial-cognitive overload, the excess of information of all kinds to which the human brain was subjected in those days, and which it was not yet capable of properly processing. People diagnosed with sensorial-cognitive overload were extremely anxious and agitated, suffered from insomnia and exhaustion, and had great difficulty concentrating on a single subject for more than fifteen minutes. Many treatments were developed, but none proved very effective, for the cause of the problem was digital technology itself.
Only in the first decade of the 22nd century was a cure for the illness created – and not by the medical sciences, but by bioelectronics. Scientists developed a microprocessor that could be successfully connected to certain brain areas – especially those most responsible for memory-making – through a human-machine interface. This microprocessor (known as the “chibrain”, from “chibi”, Japanese for “small”, and “brain”) was designed to store all the information recorded by the brain, like a small computer; and, just like a small computer, its storage could be browsed as if the person were opening folders or using a search engine.
It was installed on the back of the head, anchored to the skull and encased in a thin shell of titanium and a special type of rubber designed to absorb strong impacts. The second series of chibrains added the ability to contain “thought processes”: any line of thought started by the person, from a problem being solved to a conversation, could be stored in a process slot and then accessed and continued later on.
This was considered one of the most revolutionary and controversial inventions in human history. Controversial because the chibrains would bring humans one step closer to mechanization – and, at the time, people still harbored great prejudice against high levels of cyberization (largely due to the Android Crisis) – and also because of a widespread concern that, if only the wealthy could afford chibrain implants, the device would give them too great an advantage over less privileged people. The first controversy faded as time went by, while the second was overcome when national governments, encouraged by the UN, made chibrain implants available to every willing citizen.
As for the changes brought about by the invention, they were many. First of all, it put an end to the “illness of the century”; second, the chibrains greatly augmented humans' cognitive abilities, arguably taking them one step further in their evolution. The third great change came with the fourth chibrain series, the first to let an individual connect directly to the Internet – including the Internet of Things – through the implant itself. As with the chibrain's other abilities, this required training, but it eventually allowed the individual to navigate the Cyberplane, search for information and even communicate with others without any other medium.
Also, each chibrain from that series onwards (chibrains from older series could be updated to newer ones) had a unique code imprinted into it, much like an old computer's IP address; it became another mark of one's identity, as unique as one's fingerprints.
Yet, for all its importance, the chibrain was just one of the two main inventions that revolutionized both the Cyberplane and the matworld in the 22nd century. In the 2130s, advances in claytronics – a mass of nanorobots that can take on virtually any physical shape, in use for nearly a century by then – made it possible for people to physically project themselves into special environments through the Internet.
That is because, even though the Cyberplane allowed people to interact with each other and with environments, those interactions were still entirely virtual; with the new claytronics, however, one could project oneself into a matworld environment and physically interact with it – meaning, for instance, that surgery could now be performed from far away, something not possible with holographic projections alone. Claytronic projections of people were called Cybermat avatars (a portmanteau of “cybernetic” and “matworld”), and places built specially for this technology were called Cybermat environments.
Around 2150, when there was virtually no one on the Earth, the Moon or the Star Rings without a chibrain, and with Cybermat technology becoming widespread, the new era of cyber-history began: the Merging Stage, in which every person was permanently connected to the Internet and to everyone and everything else connected to it – for a chibrain's removal was lethal. External devices were never completely abandoned, as navigating the Cyberplane through the chibrain for too long, while it also handled matworld tasks, was exhausting even for the best-trained user; but that did not change the fact that the digital had finally and inexorably merged with the material.
The chibrain brought new health problems as well. The first series sometimes malfunctioned, producing electric discharges which, albeit tiny, caused great stress to the brain. Later series overcame this and other issues, but many specialists theorize that, soon enough, full cyberization of the human brain will be necessary for it to comfortably and efficiently hold all of the information to which it is subjected in the world of the 23rd century; but that remains to be seen.