With a new year already upon us, most of us have new expectations and seek new opportunities on the horizon: expectations to assess our personal, family and community status during and beyond 2023. All in an epoch where everything changes at the speed of light, in step with technological advancement.

Before we achieve anything, though, we need to set a solid base upon which this future will be built. It is always honourable to acknowledge, and make second nature, the phrases: (a) I do not know, (b) I am happy to fail trying, and (c) I love what I will be working on, not necessarily in that order.

Also, we should endeavour to find the medium that will carry us to our much sought-after goals. Usually, if not always, new technology plays an instrumental role in individual or institutional progress. We generally look to technology to identify what is defining today's computing revolution and industrial generations.

Scientifically, the fact is that we are already past the fourth generation in computing, discussed across the globe as the era in which technology reached an accepted stage: computers running on wires and machinery, mechanical equipment interacting with humans. What really marks the end of fourth-generation computing is the advent of robotics. We have been able to achieve a lot more, a lot more sharply and a lot more safely in recent years using robots; there is no doubt about that. But often we wonder what the next generation is. Where are we heading? Are we really making discoveries as we examine and discuss progress, or do we follow a pre-determined plan?

Photo: Alex Motoc/Unsplash

The general view is that technological generations are defined by the following periods (related to but not exactly aligning with industrial revolution generations):

1) 1940 to 1956, when we started with vacuum tubes and ended up with giant calculators.

2) 1957 to 1963, when transistor technology was introduced to replace vacuum tubes.

3) 1964 to 1971, when integrated circuits (ICs) were introduced; they are still in use today.

4) 1972 to 2010, when microprocessors were invented (microprocessors are chips inside integrated circuits that perform calculations at speeds never seen before). This period saw the introduction of the first home computers, popularised by IBM, with input devices such as a keyboard and mouse and output devices such as a monitor and printer. This generation improved our understanding of what computing power is all about and how to use it.

5) 2010 to today. What is defining today's computing revolution and generation is not yet clear. We have not reached the point where we can say we understand exactly where this is taking us. We introduced AI (Artificial Intelligence) on a large scale, along with quantum computing: things the public does not yet fully comprehend. It is desirable for the public to understand technological advancement.

I would like to make an important insertion here, with a comment aimed at the regulators who define the five technology generations above: computing devices existed before 1940. As far as I am concerned, the regulators decided to call that year the start of the technological generations because they lacked historical knowledge, or because their understanding of ancient advances in technology was very limited. It is like someone saying, "before my knowledge, nothing existed". Or like the moderate solipsism of Gorgias, which holds that "knowledge of anything outside one's own mind is unsure or may not exist".

The question at hand, if we accept the five generations above, is this: where do regulators place the Antikythera Mechanism (built by Hellenes between 220 and 150 BC)? And, before that, the Minoan-era eclipse calculator (circa 1500 BC)?

To explain ourselves better, we need to define what a computer is. The way I would define it, a computer is a man-made device that can process and store data. It matters little whether it is an electronic, a mechanical or a hydraulic device. In fact, if you google it, you will get this answer: "a computer is an electronic device for storing and processing data, typically in binary form, according to instructions given to it…" Ah, binary form, i.e. "on" or "off" bits placed in order to form a byte. Is that when computing started? Or did we just change the definition then? An electronic device? So, if something non-electronic can process and store data, is it not a computer?
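As an aside, the "binary form" in that dictionary definition is easy to make concrete. Here is a minimal, purely illustrative sketch in Python showing how eight "on" or "off" bits, read in order, form a single byte (the particular bits chosen here are an assumption for the example, nothing more):

```python
# Eight on/off bits, most significant first, make up one byte.
# These particular bits spell out the number 65, which is the
# character 'A' in the ASCII encoding.
bits = [0, 1, 0, 0, 0, 0, 0, 1]

value = 0
for bit in bits:
    value = (value << 1) | bit  # shift left, then append the next bit

print(value)       # 65
print(chr(value))  # A
```

Nothing in this little exercise requires electronics: the same ordered on/off states could just as well be pins, gears or water valves, which is exactly the point about mechanical computers.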

Well, the Minoan-era eclipse calculator, invented during the Minoan period, could calculate eclipses of the moon and the sun given the right settings, and also served as a sundial and as an instrument for determining geographical latitude. The Antikythera Mechanism predicted astronomical positions and eclipses in advance. Then there is the astrolabe (invented around 170 BC), which precedes the Antikythera Mechanism. Similarly to the other two, it could handle several problems in astronomy.

At this point it needs to be said that the ancient mechanisms above are more precisely described as analogue computers; but they were computers nevertheless, even if not electronic.

Hence, to my way of thinking, we are already past the fifth generation of technology and moving into the sixth, since the first period can begin well before 1940, rather than in 1940 as defined by the regulators or "experts" above.

Let us not squabble about generations, though. What we should squabble about is something else, much more important and much more valuable to our future wellbeing. In fact, the general public has no idea what is contemporary in technology today; most of us simply look for the next app on our smart devices. The serious need is to know exactly what is being invented these days, behind closed doors and inside guarded buildings with thick concrete walls.

The public has a right to fully understand cryptocurrencies and what benefit they can bring to our world. The public also needs to understand quantum computing. Why should the likes of Elon Musk have the right to define chips for people's brains? Who is going to control, or even inspect, the innermost functions or programs (series of commands) in these integrated circuits? How can a single entity or person be allowed to control knowledge, even if they have the funds to support such control?

Traditionally, large organisations have no intention of disclosing information on their inventions, because invention and technological advancement mean research and lots and lots of funding. In this way only the very rich and the very influential can control this revolution, which in my mind is not exactly right; predominantly because it concentrates power over mankind's progress, or demise, in the hands of the very few, and as such it can easily be manipulated.

But who is going to inform the public and encourage it to find the right answers? Do we need a wider discussion? Should we legislate against any individual or organisation that does not disclose its real intentions? Do we have an allegory here, like Plato's cave?

A plethora of questions arises, including how important it is to admit that we do not need to fight new technology, since computing devices are only tools. A tool in the wrong hands can cause devastating destruction, whereas in the right hands it can revolutionise humanity and significantly speed up progress.

The fifth generation of computing is already upon us, then. It has brought with it Artificial Intelligence, and we have no idea where that will lead us. In a previous article titled "The psychology part of AI", published in Neos Kosmos in September 2021, I noted:

“The biggest challenge lies in two factors:

(1) how humanity will accept such clever tools and

(2) the outcome of the conflict between the human race and AI from which man is already lagging behind in many areas.”

Bitcoin cryptocurrency. Photo: Jievani Weerasinghe/Unsplash

However, back in 2021 I understood far less about technological progress than I do now. For someone in technology development as a profession, it is vital to keep up with all technological advances. One does not necessarily need to know everything (man never will be able to know everything). The question is whether machines and AI will be able to accomplish that.

What, then, would be beneficial for the common good? New technology always wins. Whether we accept and use it, finding our own little revolution, or decide to do without it, new technology wins in the end. Additionally, the clash between old and new, even if controversial and full of heated exchanges of opinion, only prompts improvements to technology's flaws; it does nothing to deter many from using and enhancing it. More to the point, whoever understands and uses the culture of change is a winner.

I have neither the intention nor the space to cover cryptocurrencies and blockchain technologies here, but I will at some point in the future. Like everyone else, I need to accept that "I don't know" before I have a chance to learn. Many journalists and commentators come out advertising detailed knowledge of blockchain technology. Many find fault with it and reject it, most of the time because they use the proven failed reasoning and experiences of the past to judge it. New technologies need a fresh approach, not the outdated reasoning that failed us in the past.

The challenges technology faces do not threaten the technological revolution; they belong to the past. And do not think that because cryptocurrencies were devastated in 2022 it is the end of them. These are significant futuristic mechanisms that (a) we do not fully understand as yet and (b) we have not adequately regulated to make safe.

Some help is on the way. Most countries, including the largest in the world, are regulating now. With a simple Google search for the European cryptocurrency laws, which have already been made public, people can access, read and understand them. Only a handful of countries still restrict the use of cryptocurrencies. The world is getting ready to accept them.

Australia, on the other hand, is lagging at the level of government regulation. The Australian Government has historically taken a minimal-intervention approach to the regulation of cryptocurrencies. In fact, it has done little, and I feel it is really waiting for larger countries to regulate first. But regulation is incoherent internationally, and recognised laws have not been agreed upon; this is the reason we witness such incredible volatility worldwide.

Personally, I have not read all the regulations (there are hundreds, if not thousands, of pages worldwide), but what I have read so far I do not like. As one of those legacy IT programmers from before the internet even existed, we started our careers with ambitions and a scope to change this world. In the end we simply compromised, allowing technology to lead us into the future. We understood that it is quite difficult to change direction: it takes a lot of pain and sacrifice, and it must be a co-ordinated effort. Should we start afresh from ground zero? I still don't know. Some carry more responsibility than others; we are not all equal in our duty of disclosure and advice. Do we need to hold those in the know to account?

Then again, "we are trying to regulate the future with past failed assumptions about technology", cries astrophysicist Manos Danezis, who continues: "Those who are in the know need to inform everyone. This is democracy."

*Iakovos Garivaldis OAM is an IBM Certified Solutions Expert – U2 AppDev, Adm