
Future-Proof Technology

Future-proof technology is technology that is very likely to stay functional for a very long time with minimal to no maintenance, even in the face of significant changes in the state of technology in society. In a world of relatively complex technology, such as that of computers, this quality is generally pretty hard to achieve; today's consumerist society makes the situation much worse by focusing on immediate profit without long-term planning and by embracing things such as bloat, intentional introduction of complexity, obscurity, dependencies and planned obsolescence. But with a good approach, such as that of LRS, it is very possible to achieve.

Truly good technology tries to be future-proof because this saves us the great cost of maintenance and of reinventing wheels, and it gives its users comfort and safety: users of future-proof technology know they can build upon it without fearing it will suddenly break.

Despite the extremely bad situation not all hope is lost. At least in the world of software, future-proofing can be achieved by:

  • Free (as in freedom) software -- making your source code available, legally modifiable and shareable is a basic step towards making it easy to repair, back up and adapt to new technology (e.g. compile for new CPU architectures etc.).
  • Building on top of already well established, time-tested and relatively simple technology such as the C language or comun. Choosing to use older standards with fewer features helps greatly, as the less feature-rich versions of languages are always better supported (for example there are many more C89 compilers than C17 compilers) and can even be relatively simply reimplemented if needed. Another example is OpenGL -- you should use the oldest (simplest) version you can to make a program more future-proof.
  • Minimizing dependencies to the absolute bare minimum and offering alternatives and fallbacks in cases where you can't avoid introducing a dependency (e.g. you should always offer an option for software rendering in any program that by default uses the GPU for 3D graphics). Dependencies are likely the single greatest cause of software death because if one of your dependencies dies, your whole project dies, and this goes recursively for all of the dependencies of the dependencies etc. This usually means software libraries but also goes for other software such as build systems and also hardware dependencies such as requiring a GPU, floating point, special instructions etc.
  • Practicing minimalism and reducing complexity, which minimizes the maintenance cost and therefore raises the probability of someone being able to fix any issues that arise over time. Minimalism is necessary and EXTREMELY important: bloat will make your program very prone to dying as it will depend on a big community of programmers that maintain it, and such a community will itself always be very prone to disappearing (internal disagreements, stopped funding, loss of interest, ...).
  • Making your program portable -- this ensures your program can be adapted to new platforms and also that you use abstractions that untie you from things such as hardware dependencies.
  • Generally just avoiding the hyped "modern" "feature-rich" (bloated) technology arising from the consumerist market.
  • ...

Just think. To see how likely your program is to die in a short time, just ponder for a while: what parts is it composed of and how likely is each of them to continue functioning and be fixed if it breaks? It's really like with a house or a car. Notice that the probability of breaking increases with complexity and the probability of fixing decreases with complexity (because a fix has a higher cost -- it needs more time, energy, skill and so on). Is your program written in a simple language for which there already exist 10 compilers and which can be implemented again in a month if needed? Then the program is not likely to die by the compiler or anything that may kill a compiler, such as a different operating system or a new CPU architecture. Is it written in a 5 year old language that's still changing under your hands, has a single compiler and itself relies on 100 highly complex libraries? Chances of death are great: it is certain your program will break with the next update of the language, or the one after that, you'll have to be around to fix it, and then a month later and then another month and so on until you die, for every program you have written in this language. Does your program only need two libraries, both of which can easily be replaced by something else by rewriting only 10 lines of code? Then your program likely won't die because of these libraries! Does your program use 10 very convenient but complex libraries, each of which additionally uses 10 other libraries itself?
In a simplified way you can see your program depending on 100 other libraries now; if the chance of one such library breaking during one year is 1%, the chance of your program breaking in one year is 1 - 0.99^100 ~= 63%. If it doesn't break this year, then the next or the one after that -- yeah, someone will likely fix a library that breaks, but maybe not, projects get abandoned out of burnout, boredom, in favor of new ones etc., and a fix of your dependency may also come with the need for you to be around and update your own program because of an API change. Does your program depend on a piece of consumerist hardware that in 2 years will stop being supported? Or some specific operating system or Internet service posing similar dangers? That is an additional thing on your list to watch, else your program dies. If your program breaks without you being around, how likely is it to be fixed by someone? How many people in the world will be capable and willing to fix it? If the program is simple, likely any average programmer will be able to fix it, and if the fix takes 10 minutes of time, someone will most likely do it just out of boredom. If your program is 100 thousand lines of code long and requires knowledge of 10 specific framework APIs and its inner architecture just to modify anything of importance, an average programmer won't be able to fix it, he won't even attempt it -- and if there is someone able to do the job, he won't fix it unless someone pays him a lot of money. Your program is basically dead.

Please take a look at the table below, maybe you'll be able to extract some patterns repeating in software development history:

technology          | description                                                                                          | born | dead
Lisp                | programming language                                                                                 | 1960 | not before you
Forth               | programming language                                                                                 | 1970 | not before you
C                   | programming language                                                                                 | 1972 | not before you
Objective-C         | C++ but owned by a corporation, everyone start using this!                                           | 1984 | 2014 (officially)
RPG Maker           | Easily make RPG games just by clicking your mouse!                                                   | 1992 | basically a zombie
Java Applets        | Make platform-independent games and web "apps", a comfortable platform to unify them all!            | 1995 | 2017 (officially)
Delphi              | Comfortable IDE for rapid development of GUI "apps", a platform to unify them all!                   | 1995 | on deathbed
J2ME                | Make mobile games with this multiplatform framework to unify them all!                               | 1999 | 2015 (de facto)
Adobe Flash         | Make impressive interactive games and web "apps", a comfortable platform to unify them all!          | 1996 | 2020 (officially)
Facebook Apps       | Easily make web "apps", a comfortable platform to unify them all!                                    | 2007 | 2020 (de facto)
Blender Game Engine | Easily make 3D games without even knowing any programming, a comfortable platform to unify them all! | 2000 | 2019 (officially)
Unity 3D            | Easily make multiplatform 3D games!                                                                  | 2005 | currently dying
JavaScript + React  | Easily make impressive web "apps"!                                                                   | 2013 | surely won't die, right?
Godot Engine        | Easily make multiplatform 3D games!                                                                  | 2014 | surely won't die, right?

See Also