<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom"><title>frankie-tales</title><id>https://lovergine.com/feeds/tags/rant.xml</id><subtitle>Tag: rant</subtitle><updated>2026-02-25T15:33:03Z</updated><link href="https://lovergine.com/feeds/tags/rant.xml" rel="self" /><link href="https://lovergine.com" /><entry><title>A decent SSH client for Android is not what one would expect</title><id>https://lovergine.com/a-decent-ssh-client-for-android-is-not-what-one-would-expect.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2026-02-08T16:00:00Z</updated><link href="https://lovergine.com/a-decent-ssh-client-for-android-is-not-what-one-would-expect.html" rel="alternate" /><content type="html">&lt;p&gt;I’m not too happy with using mobile devices as a daily driver for server
connections. When one needs to use a keyboard, in many cases the appropriate
device is inevitably a laptop or a desktop computer. Still, sometimes the
mobile phone must serve as an SSH client in an emergency or for simple remote
tasks, ideally with an app that is usable, decently supported, and able to
share common configurations among multiple devices with a reasonable level of
security.&lt;/p&gt;&lt;p&gt;For years, the most used client on Android has been &lt;em&gt;JuiceSSH&lt;/em&gt;, but unfortunately,
it became abandonware some years ago, and at the end of last year, it was also
delisted from the Play Store. A few days ago, I changed my smartphone and
finally moved to the latest &lt;a href=&quot;https://shop.fairphone.com/home&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Fairphone&lt;/a&gt; model.
Now, anyone who knows me already knows how
much I hate mobile phones, which are generally the realm of apps with the worst
UX ever conceived. On the new phone, I discovered that the pro option of JuiceSSH is
also dead, and basically, SSH forwarding cannot be used anymore. Too bad. So I
decided to look for a state-of-the-art SSH client application and discovered
that the generally suggested apps (i.e., Termius and ConnectBot) are the usual
PITA.&lt;/p&gt;&lt;p&gt;Thank goodness, &lt;em&gt;Termux&lt;/em&gt; entered the room.&lt;/p&gt;&lt;p&gt;For people who don't know it, it is an Android terminal that emulates a color
xterm, but has some nice features, specifically a damn good
&lt;a href=&quot;https://wiki.termux.com/wiki/Package_Management&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;package management tool&lt;/a&gt;
built in. Of course, it is pure FOSS.&lt;/p&gt;&lt;pre&gt;&lt;code&gt;pkg install openssh git vim&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;It is also handy to enable access to the shared storage area with&lt;/p&gt;&lt;pre&gt;&lt;code&gt;termux-setup-storage&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;This is useful for exchanging files with remote hosts and keeping them
available for/from other apps.&lt;/p&gt;&lt;p&gt;Now, a useful &lt;code&gt;.ssh/config&lt;/code&gt; file can be pulled via &lt;code&gt;git&lt;/code&gt;, stored locally, and
possibly customized to simplify SSH network access, with all the helpful host
stanzas. Even a local passphrase-protected key pair can be created and stored
locally for safety.&lt;/p&gt;&lt;p&gt;So far, so good. Probably, I should simply start giving up on treating Android
as a special beast and treat it as just another Linux host. Fewer apps, more
terminal, and fuck the majority!&lt;/p&gt;</content></entry><entry><title>The perfect desktop is a matter of points of view, or not?</title><id>https://lovergine.com/the-perfect-desktop-is-a-matter-of-points-of-view-or-not.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2026-01-22T19:40:00Z</updated><link href="https://lovergine.com/the-perfect-desktop-is-a-matter-of-points-of-view-or-not.html" rel="alternate" /><content type="html">&lt;p&gt;I recently learned about an opinionated flavor of the Arch distribution called
&lt;a href=&quot;https://omarchy.org/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Omarchy&lt;/a&gt;, which is basically a collection of desktop
packages built on top of a rolling Arch distribution. Nothing special, except
for the vocal original author of the scripting work behind this flavor, who
happens to be, as with many old-school self-centered geeks out there, the much-discussed
DHH. I will not go into the reasons for the dubious
fame of David &amp;quot;DHH&amp;quot; Heinemeier Hansson, which basically stem from some of his
past posts on X/Twitter and some of his questionable ideas.&lt;/p&gt;&lt;p&gt;&lt;img src=&quot;/images/wm-vs-de.png&quot; alt=&quot;The great fight between WMs and DEs&quot; /&gt;&lt;/p&gt;&lt;p&gt;I’m not interested in that here. I’m more interested in some spontaneous
thoughts about the hype (well, at least among the very restricted niche of
Linux desktop fans) around this desktop flavor. It is not something new; the
Hyprland UX is basically an &lt;a href=&quot;https://i3wm.org/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;i3&lt;/a&gt;-like
&lt;a href=&quot;https://en.wikipedia.org/wiki/Tiling_window_manager&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;tiling window manager&lt;/a&gt; on steroids,
based on Wayland rather than Xorg, with a few bells and
whistles.&lt;/p&gt;&lt;p&gt;I have been a long-time Linux desktop user since the 90s, and a tiling window
manager (specifically a dwm-derived incarnation, &lt;a href=&quot;https://awesomewm.org/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Awesome WM&lt;/a&gt;) has been my
main desktop for quite a few years. Some years ago, I abandoned such a paradigm
when I finally realized that a pure tiling window manager is a great idea until
it isn't. Basically, most of its &lt;em&gt;pros&lt;/em&gt; (one application per virtual desktop, easy
tiling on big displays, keyboard-driven navigation) can be easily replicated in
a capable desktop environment like any current Gnome version. This has the big
advantage of being ready for use right after installation and of being easily
and fully customizable via plugins. The &lt;em&gt;cons&lt;/em&gt; of a tiling WM, instead, are always present,
depend on your workflows, and generally have no easy workarounds. The biggest is
the need to find tricks and third-party tools to solve use cases that are not
always trivial (or worse, that are trivial on a DE instead).&lt;/p&gt;&lt;p&gt;A DE has the indisputable advantage of including all batteries for widgets and
customization tools, whereas most (if not all) WMs require third-party tooling to
manage many disparate concerns, such as Bluetooth, Wi-Fi, hot-plug devices,
auto-detection of projectors, dynamic multi-display setups, fast binding of container apps,
accessibility features, and many others. Too often, such WMs also require a
command-line tool or a workaround to perform tasks that are simply part of the common DE experience.&lt;/p&gt;&lt;p&gt;I also remember the pain of using the multi-window GUI of GRASS GIS under Awesome,
which was, at the time, designed as just another application for a floating window
manager. When an application opens a new window for every module in
use, well, the UX can become a nightmare under a tiling WM, unless you are
using a 43-inch display. The same goes for virtualized desktops: when the
guest and host compete for keyboard control, the continuous switching can
rapidly lead to madness. Those are just a couple of examples to conclude that the
coolness of a desktop implementation is often a matter of perspective and
personal workflows, and I have consistently found that a mandatory tiling WM paradigm
is simply less flexible in some practical cases.&lt;/p&gt;&lt;p&gt;To be honest, I find the Omarchy UX to be the typical incarnation of a canonical
WM-based interface for fresh Linux desktop users. Such users are divided into
two classes:&lt;/p&gt;&lt;ul&gt;&lt;li&gt;&lt;p&gt;The class of people who are searching for an exact replica of Windows/macOS
GUI. A hopeless group of people: if something has to look like Windows, with
exactly the same policies and the same applications and even the same icons, well,
they should probably stay with Windows: simple and clean. They are the most
critical and vocal complainers, the users for whom the Linux-on-the-desktop era will
never come.&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;The class of people who look for something radically different and discover
keyboard-driven interfaces as something almost magical (without fully
realizing that such an experience can be replicated easily, mostly by using
environment shortcuts and some simple plugins). A trivial secret, I would say.
They are the most enthusiastic about this kind of desktop, but also regular
distro-hoppers (yes, that is an insult coming from me: distro-hopping is for gamers, not
for workers who need to complete primary tasks daily); more often than not,
they will never admit they are simply playing around, and solving self-inflicted problems
is part of their game.&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;Of course, the WM-based desktop paradigm still has its own use cases,
which I group under a few limited scenarios:&lt;/p&gt;&lt;ul&gt;&lt;li&gt;&lt;p&gt;You are using a strictly rolling-release distribution, such as Debian sid/unstable,
Arch, Guix, Gentoo, and the in-development flavors of other mainstream
distributions, including Fedora and OpenSUSE. On such distributions, avoiding
desktop environments reduces the likelihood of encountering temporary problems
after daily upgrades, as some transient (in the order of days/weeks) breakages
can occur. But who uses such platforms today? Seriously, I think only
someone actively involved in development and testing should be interested in
such distributions. Today, most desktop apps are distributed as containerized
packages via one of the multiple available hubs for Flatpak, AppImage, Snap,
Docker/Podman. I can’t see the practical advantage of using an unstable
distribution on a daily basis. If you think differently, dude, you have a
problem, and it is not the distribution you are using, but it is what you see in
the mirror. If you are not a YouTuber who needs to produce videos to monetize,
well, you are probably simply using the wrong distribution pointlessly (and
creating your own problems from time to time, which are perfectly part of the
rolling-release experience, by design).&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;You are using an old, low-resource box with limited RAM and cores, a platform
that simply cannot run current desktop environments. Indeed, I seriously
doubt it could be used for general computing at all. Nowadays, even a web
browser is simply a resource hog on such platforms. I mean, a dual-core box with 4 GB of
RAM that could be more than 15 years old. If this is your platform, well, a
window manager is perfectly legitimate, but such a box probably couldn't run
Hyprland either. And to do what? I mean, other than installing it and bragging about it to
friends...&lt;/p&gt;&lt;/li&gt;&lt;li&gt;&lt;p&gt;You are an old-style lazy geek, anchored to your own configurations, refined
over dozens of years, with very few reasons to change. That’s perfectly
legitimate, but most of those configurations could probably be out of date. I
know, you are still adjusting your Modelines in your Xorg configuration. Well,
dude, it’s probably time to climb down from the tree.&lt;/p&gt;&lt;/li&gt;&lt;/ul&gt;&lt;p&gt;Of course, I also tried to install Omarchy on an old box of mine (an 11-year-old
Lenovo ThinkPad L540 with a dual-core i5 and 8GB RAM) that runs perfectly with
the current Debian 13 and Gnome 48. Sadly, it did not even boot the installer:
just a dark screen. Good, but not too good, dudes.&lt;/p&gt;&lt;p&gt;And this leads me to the elephant-in-the-room argument for this post. Most users
need stability and, occasionally, up-to-date applications. Average users
need the certainty that they can easily install an OS on most platforms and have a
stable UX for a decently long period (let’s say 2-5 years without any
reinstallation in between). The more users, the more stability. The simpler, the
more effective, too. And that’s the real point most devs (or wannabe experts)
have probably missed in the meantime.
The desktop is a mere tool; it should not require an addiction to expertise.&lt;/p&gt;&lt;p&gt;It is not a matter of DE vs WM, but of homogeneity and generality versus good-but-not-enough-for-all.
If one has to reinvent the wheel to manage a
configurable tool that, in a DE, is a point-and-click away, it is a failure
in general UX. Of course, even DEs are far from perfect, but too often, WM UX is
far from even being basically complete.&lt;/p&gt;&lt;p&gt;For instance, I can easily manage my full clipboard history with inter-session
persistence thanks to a simple Gnome plugin (i.e., Clipboard Indicator). There is no
equivalent widget in most WMs, which need a third-party tool to
provide something almost equivalent, but often incomplete. Well,
Houston, we have a problem! That’s just an example, but the general approach is
clear: if one has to constantly sacrifice immediate, good-enough implementations
to adopt half-finished tools or workarounds to solve basic GUI workflows, WMs
become not accelerators of productivity but defective implementations, and that
has been my constant experience in that regard with WMs. At some point, one has
to set priorities, and after years, my priority has become not to waste time
reinventing the wheel for desktop GUIs. Sorry, guys. There is more than one way
to implement a desktop interface, but many of them can simply become a pain
because they are not flexible enough or are incomplete, resulting in continuous
adjustments and workarounds to get something working decently.&lt;/p&gt;&lt;p&gt;And yes, this is another damn opinionated post about
the current &lt;em&gt;Year of Linux on Desktop&lt;/em&gt;. Don't take it too seriously...&lt;/p&gt;</content></entry><entry><title>Breaking dependencies on BigCos and a US centric IT world</title><id>https://lovergine.com/breaking-dependencies-on-bigcos-and-a-us-centric-it-world.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2025-05-17T15:00:00Z</updated><link href="https://lovergine.com/breaking-dependencies-on-bigcos-and-a-us-centric-it-world.html" rel="alternate" /><content type="html">&lt;p&gt;I recently read some interesting articles (see [1,2]) by Bert Hubert about
IaaS and SaaS in the EU, which are generally considered cloud computing at
large. He has quite a deep understanding of such topics, and the reading is
enjoyable and triggered a few reflections.&lt;/p&gt;&lt;p&gt;The problem could be seen as a scaled-up version of the &lt;a href=&quot;https://en.wikipedia.org/wiki/IndieWeb&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;indie web&lt;/a&gt;
movement.
Ensuring independence from a handful of big companies, all located outside the
continent and possibly subject to the whims of a capricious government, as
in current times, is a duty not only for individuals (who should also protect
themselves locally), but also for whole countries.&lt;/p&gt;&lt;p&gt;First of all, it is evident enough that Europe is not that bad about
infrastructure. In multiple countries on this continent, there are quite a few
big companies that have nothing to envy in the likes of Amazon, Google, or Microsoft
in terms of capabilities and critical mass. There are already companies with
multi-region data centres, a good level of automation, and high SLAs. Of course,
they are mid-range companies, not monsters with country-sized balance sheets.&lt;/p&gt;&lt;p&gt;It is already perfectly possible [3] to depend on EU-based standard capabilities,
including email services or file storage, which represent a primary part of
common cloud services. What is truly missing is a capable enough set of
web-based cloud personal productivity software based in Europe, which would be
comparable to Google Workspace or Microsoft 365, including video conferencing
and instant messaging.&lt;/p&gt;&lt;p&gt;Another important service would be a YouTube equivalent, but it is evident to
me that services of this kind are already available at a small scale. What is
really missing is a sizeable financial effort to consistently fund projects that
already exist in the FOSS/indie web ecosystem, and the will to do that instead
of paying new consortia to develop new solutions from scratch. Europe is
full of brilliant people and companies that are just waiting for that.&lt;/p&gt;&lt;p&gt;As a personal example, at work, the central management recently
abandoned the whole idea of maintaining fully &lt;em&gt;on-premise&lt;/em&gt; systems to move some
key services to the cloud for the entire national research network. Our
non-specialist needs had quite an average profile: shared storage, an email system,
the usual personal productivity tools like Microsoft Office, and a
teleconferencing/webinar system. The same could be enough to cover the digital
needs of most companies and bodies out there in Italy. The result has
been a national contract with Microsoft for MS365 and a more limited contract
with Citrix for GoToMeeting/GoToWebinar. IMHO, nothing transcendental: something
that could have been implemented with a decent injection of money to scale up a
multi-datacenter on-premise solution to have full redundancy and an equally
decent dedicated team, with many more possible features and capabilities
available, at the end of the day. Maybe the only actual key point was the
availability of MS Office, which is the only severe lock-in source for most
users (and also why Google Workspace here has no hope of being considered). But
even in the past, we paid a lot of money for desktop multi-licenses in any case,
without any additional cloud solution.&lt;/p&gt;&lt;p&gt;I will not deal deeply with these incomprehensible dependencies on a single
application: in my honest opinion, a lot of users are simply too lazy to refuse
this kind of lock-in, even if they depend on a very limited number of features
of such an application. There are at least three different alternative desktop
programs that most users could use successfully, but we still see Microsoft
Office as the holy grail (and I would also add that its web version has an
embarrassing UX).&lt;/p&gt;&lt;p&gt;Anyway, so what? That has been a deliberate abdication of autonomy in digital
services and choices, whose consequences will surely become visible in the future:
loss of internal tech skills, missing investments in FOSS alternatives, lack
of growth in human resources, and missed diversification of solution providers
(both now located in the USA, not even in Europe).&lt;/p&gt;&lt;p&gt;Seriously? We did not even need to outsource such services, only to reasonably
invest in the right direction: additional human resources and infrastructure
improvements instead of fees. In the last thirty years, I have not even
seen a cent directly paid by my organization for FOSS projects used daily by
tons of us, except for some rare fees for conference participation and sometimes
indirect payments for people who incidentally worked on FOSS projects during
their daily jobs.&lt;/p&gt;&lt;p&gt;Let me be quite pessimistic about the actual intention of European bodies to
find a concrete way of improving continental clouds, which could be perfectly
viable instead. So far, I have seen only an exaggerated capability of
defining rules for IT ecosystems, rules that are not always sensible and are often
misapplied. I have seen instead a great lobbying capability by the well-known BigCos,
which have sustained their own interests and defused any past effort on
this subject (hey, Gaia-X, yes, I'm talking about you).&lt;/p&gt;&lt;p&gt;Is this a lost war? I don't know, but I still don't see concrete signals of
changes in European policies about digital innovation, except for a big
regulation effort that does not change our full dependency on a handful of US
companies. I have more hope in individual actions, but of course, as in the case
of FOSS, they require a high level of awareness, which I still see in a limited
measure: just compare the number of Mastodon accounts against Meta socials,
TikTok, or Twitter/X ones. Maybe we are now where FOSS was about 30
years ago: a few visionaries and geeks see the problem and act, and most
people will follow. Or at least, I hope so.&lt;/p&gt;&lt;h2 id=&quot;references&quot;&gt;References&lt;/h2&gt;&lt;ol&gt;&lt;li&gt;&lt;a href=&quot;https://berthub.eu/articles/posts/now-how-to-get-that-european-cloud/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;But how to get to that European cloud?&lt;/a&gt;&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://berthub.eu/articles/posts/the-european-cloud-ladder/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;The (European) cloud ladder: from virtual server to MS 365&lt;/a&gt;&lt;/li&gt;&lt;li&gt;&lt;a href=&quot;https://european-alternatives.eu/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt; European alternatives for digital products&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;</content></entry><entry><title>About languages and tools: the walking dead and other legends</title><id>https://lovergine.com/about-languages-and-tools-the-walking-dead-and-other-legends.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2025-05-08T13:00:00Z</updated><link href="https://lovergine.com/about-languages-and-tools-the-walking-dead-and-other-legends.html" rel="alternate" /><content type="html">&lt;p&gt;I'm writing this post to react to one of the many articles and threads about the
presumed death of this or that programming language, library, framework, or
tool. What that article was about and who wrote it is secondary. I could
synthesize my idea by citing a well-known quip by Mark Twain: &amp;quot;The reports of
my death are greatly exaggerated.&amp;quot;&lt;/p&gt;&lt;p&gt;Let me use &lt;em&gt;synecdoche&lt;/em&gt; as a rhetorical expedient and limit what follows to
programming languages. Of course, this post is absolutely not limited to them;
it could be applied with little effort to libraries, tools, frameworks, content
management systems, and many other tools of common use among developers.&lt;/p&gt;&lt;p&gt;&lt;img src=&quot;/images/walking-dead-coding.png&quot; alt=&quot;The Walking Dead coders&quot; /&gt;&lt;/p&gt;&lt;p&gt;Any developer who has been around enough knows that the death hoaxes about the
end of a programming language are a common refrain that returns almost every
year for most of them. The naked truth about programming languages is that
developers follow trends and fashions. Job markets influence such trends,
as do some application categories and technologies that appear from time to
time. Without any specific preference, there are currently tons of languages
that still have significant (or not so large, but still sustainable)
communities and that have fallen to the rearguard because their trendy momentum is
in the past. A lot of people mistake the end of frantic development periods
(or, even worse, the absence of headlines on major tech news sites) for death. I
could cite many such products that were considered dead a long time ago and
still see one or more releases per year. From stability to death, it is often a
matter of points of view.&lt;/p&gt;&lt;p&gt;I would not refer to Fortran, Cobol, Ada, Prolog, Lisp, etc., which have been
around for 60-70 years. For me, all those are clearly niche languages that still
have their own use in specific domains, and that has been true at least in the
last 40 or 50 years. In most cases, they are simply not updated with new features, or not
even able to handle common applications or programming patterns of the modern
era.&lt;/p&gt;&lt;p&gt;Who would try to write a web framework in Fortran or Cobol? Oh well, you
probably don't know &lt;a href=&quot;https://fortran.io/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Fortran.io&lt;/a&gt;,
an MVC web framework written in Fortran 90. Or any of the full-stack web
frameworks written in Cobol.
So, it is better to say that for such languages, some applications are purely
intellectual challenges, drafts, and sketches that no one would seriously(?)
consider in production environments. That does not imply that such products are
at a dead end, but only that they are not considered for those applications (but
could be for other ones).&lt;/p&gt;&lt;p&gt;Beyond them, there are more recent languages that gained the stage about
20-30 years ago or less, such as Java, Perl, Ruby, or PHP, which are still in
use in production environments but declining in popularity. A special case is
C/C++ and its variants, which most consider low-level languages at a dead-end
but are actively used in many application domains. Today, Rust is considered by
popular rumor to be their natural replacement, but again, there is no evidence
that it will truly be so in the future. Often, in the past, what appeared to be
an ineluctable success in a certain period revealed itself to be a pure illusion,
replaced by the next dream language.&lt;/p&gt;&lt;p&gt;So what? A dose of sane realism is genuinely required. Developers are voluble
and fall into puppy love like teenagers. What today seems like the way to go
could turn out to be a mere dazzle a few years (or even months) later. Being a bit
conservative could help, but the whole &lt;em&gt;silver bullet&lt;/em&gt; idea for tooling is
self-sabotage. There is no one ring to rule them all. Simply, there is no
single language to win in all fields, and the skill of being able to switch
comfortably among multiple ones (possibly finding the most helpful for a
specific goal or application domain) is the true superpower of a developer.
That's specifically true if such languages expose different programming patterns
and abstractions. All the rest is gossip and opinion. Sometimes, a specific
package or framework determines the convenience of using a language (do you
remember the whole Ruby-on-Rails momentum?), as does the existence of a very
specific language feature (e.g., Erlang's efficiency in concurrency). That is the
true basis of a reasoned choice for an implementation.&lt;/p&gt;&lt;p&gt;That said, the only concrete problem nowadays is the job market. A learning plan
that includes only good-but-old tools, or, at the opposite extreme, only very recent
ones, could be equally wrong. The job opportunities could be equally few for
both. It seems that the most convenient language is one with a vast community,
but unfortunately, that could be a transient status: at the very beginning of
the third millennium, it seemed that Perl was the language of choice for web
applications, which clearly turned out not to be the case just a few years later. So what?
Well, a grain of salt is due in any case, but what seems like the current
primary choice could become a language of the past in just a few years.&lt;/p&gt;&lt;p&gt;A complete developer should know, at least at a non-trivial level, a
systems programming language (e.g., C/C++ or Rust), as well as Python,
JavaScript, and possibly also a pure functional one (e.g., Scheme, Clojure, or
Haskell). Of course, moving from a non-trivial level to the guru level is a
matter of time and experience, and it may never happen in practice for
all of them. The more languages and programming paradigms you master,
the better for you.&lt;/p&gt;&lt;p&gt;What will be the next language to die while still seeming to be
the current Big Thing? I have some suspicions, but I keep them to myself.&lt;/p&gt;</content></entry><entry><title>The shattered Internet</title><id>https://lovergine.com/the-shattered-internet.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2024-08-02T13:00:00Z</updated><link href="https://lovergine.com/the-shattered-internet.html" rel="alternate" /><content type="html">&lt;p&gt;I recently finished reading &lt;a href=&quot;https://www.bollatiboringhieri.it/libri/vittorio-bertola-internet-fatta-a-pezzi-9788833942018/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;a book published one year
ago&lt;/a&gt;,
written by &lt;a href=&quot;https://bertola.eu/&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Vittorio Bertola&lt;/a&gt; and &lt;a href=&quot;https://en.wikipedia.org/wiki/Stefano_Quintarelli&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;Stefano Quintarelli&lt;/a&gt;.
Unfortunately, it is only available in Italian, but its title perfectly captures all the topics
it covers: &lt;em&gt;The shattered Internet: digital sovereignty, nationalisms, and big
techs&lt;/em&gt;.  Like me, Vittorio and Stefano are among the relatively few early users and
participants of the primeval internet network of the 90s, even before the World
Wide Web was conceived. This book is a disenchanted and realistic journey through the
story of the &lt;em&gt;Big Network&lt;/em&gt; and how it has become a broken dream today in many
respects.&lt;/p&gt;&lt;p&gt;Thinking about it, it also shares some of the reasons why I started this
self-hosted blog recently. By the end of this post, one could even conclude
that this site and the whole &lt;em&gt;indie web&lt;/em&gt; movement make little sense altogether.
Simply, they represent another unrealistic attempt to return to the origins.
In short, it's just a daydream. Maybe, or maybe not.&lt;/p&gt;&lt;p&gt;The Internet has been conceived from the beginning as a great, unified,
worldwide and resilient web of neutral connections based on open technical
standards and cooperation among developers and participants to allow
end-to-end communications all over the world, without discrimination.  At its very
beginning, in the middle of the 90s, it appeared to be a realized dream to the
most tech-savvy people.&lt;/p&gt;&lt;p&gt;Unfortunately, reality later started to show itself in all
its hard truth. The world is not neutral and equal for all human beings, and
there are multiple drivers of inequality and diversity. Moreover, human groups
tend to create private &lt;em&gt;walled gardens&lt;/em&gt; with deep moats among themselves, often
for the mere interests of the few.&lt;/p&gt;&lt;p&gt;Nowadays, there are at least two great sources of fragmentation for the
Internet, born of its own worldwide success: nationalisms (and, let me also say,
different ways of seeing life, values, and our society itself) and the creation
of an oligopoly of a few big companies that dominate the network. Companies are
interested in making a profit and maintaining their walled gardens with millions
of users-customers locked in there.
This is not something new, but it is
a big problem when companies have balance sheets larger than those of many countries.&lt;/p&gt;&lt;p&gt;These centrifugal forces are shattering, day by day, the dream of the
big, unique, and peaceful network.
Internet users are more and more closed into limited bubbles, because
of their nationalities and cultures or the profit plans of the big corps.&lt;/p&gt;&lt;p&gt;Note that, like the book's authors, I don't think that the Western, US-centric
world has the correct or absolute answers for that. In many cases, I cannot share some ideas and
values considered &lt;em&gt;standard thinking&lt;/em&gt; overseas. I don't even know
whether the tentative regulation policies here in Europe will succeed in creating
a better and more respectful network.&lt;/p&gt;&lt;p&gt;Moreover, in many countries the Internet is limited and under the control and monitoring of central authorities, and I'm not
talking only about North Korea, China, Russia, Iran, or other nations with well-known
restrictions on network access. As we all discovered in the recent past, even the so-called free
democracies &lt;a href=&quot;https://en.wikipedia.org/wiki/Edward_Snowden&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;show their failings&lt;/a&gt; from time to time.&lt;/p&gt;&lt;p&gt;Anyway, as tech-savvy individuals, we have the right and the duty to escape as much as possible
from the mainstream, short-sighted vision of the network, by diversifying and
avoiding the walled gardens, as well as
the &lt;a href=&quot;https://en.wikipedia.org/wiki/Pens%C3%A9e_unique&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;&lt;em&gt;unique thought&lt;/em&gt;&lt;/a&gt; about the
&lt;a href=&quot;https://en.wikipedia.org/wiki/The_End_of_History_and_the_Last_Man&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;evolution of society&lt;/a&gt;.&lt;/p&gt;&lt;p&gt;It is a matter of freedom and equality for all of us, even if it sounds like wishful thinking.
And above all, even if many people out there do not care and are willing to give up
their privacy and freedom, too.&lt;/p&gt;&lt;pre&gt;&lt;code&gt;Wake up, Neo...
The Matrix has you...
Follow the white rabbit...
Knock, knock, Neo.&lt;/code&gt;&lt;/pre&gt;</content></entry><entry><title>Are distributions still relevant?</title><id>https://lovergine.com/are-distributions-still-relevant.html</id><author><name>Francesco P. Lovergine</name><email>mbox@lovergine.com</email></author><updated>2024-07-29T20:00:00Z</updated><link href="https://lovergine.com/are-distributions-still-relevant.html" rel="alternate" /><content type="html">&lt;p&gt;In principle, and in the traditional vision, the roles were clear enough. Upstream
developers had to create and support their own projects, including multiple
libraries, tools and modules, possibly for multiple operating systems.
Distribution maintainers had the responsibility of collecting a significant software
set, porting it to various architectures, choosing versions of each piece of software
that work well together, patching for coherence and for well-established
policies, and finally providing a build and installation system for end
users. At the end of the day, quite complicated and multifaceted work that
many people out there do for fun, and others as a full-time job.&lt;/p&gt;&lt;p&gt;Some distributions have been around since the
beginning of the 90s and still release new versions regularly, including Debian GNU/Linux,
an ecosystem where I have lived and collaborated for almost 25 years.
That was an ideal workflow, managed differently and with diverse goals
by multiple non-profit associations and companies, including the Debian project.
Distributions aimed to provide the most stable and reliable daily-use system,
especially for servers and enterprise ecosystems.&lt;/p&gt;&lt;p&gt;That was true until 15 years ago or less, when virtual machines, and later containers,
changed the game and the whole cloud computing revolution started. In
retrospect, that has not been the only driver of the change. Another important
aspect has been the great relevance that &lt;em&gt;dynamic languages&lt;/em&gt; and their ecosystems
gained during the same period.&lt;/p&gt;&lt;h2 id=&quot;the-new-world-of-hubs&quot;&gt;The new world of hubs&lt;/h2&gt;&lt;p&gt;Hubs, hubs everywhere, and for anyone. For programming languages, as well as for
containers and virtual machines. Starting with Perl and its CPAN archive, all
currently used languages have grown their own ecosystems of packages and modules hosted
on third-party delivery networks. Most modern applications are based on
the distributed efforts of thousands of developers who create and maintain
thousands of tiny or large modules, each addressing some very specific need,
modules that inevitably live in their respective language hubs.&lt;/p&gt;&lt;p&gt;Your latest applications, almost certainly, could only exist thanks to dozens - or
hundreds - of &lt;code&gt;include/require/use&lt;/code&gt; clauses written in some of the prime-time dynamic languages
that currently ride the wave of popularity. These are sub-modules often developed by
small independent teams or even single developers, packages that are mostly
&lt;em&gt;open source&lt;/em&gt; and come with no warranty as to their use or purpose.&lt;/p&gt;&lt;h2 id=&quot;hubs-for-developers&quot;&gt;Hubs for developers&lt;/h2&gt;&lt;p&gt;In recent years, developers have learned to distribute their own laptops along with their applications: let me
simplify. Instead of creating minimal, well-engineered traditional packages for some target platforms,
and possibly installers, we now distribute container images on cloud computing resources
with all the required software piled up inside them. Whether those images are based on Docker, Podman,
Lilipod, or Apptainer is entirely secondary. Most applications are written
in a dynamic language and install gigabytes of dependency modules and libraries (often
in multiple versions) all at once. Giant software blobs stacked on top of some
very tiny operating system layer. All that started with &lt;em&gt;continuous integration&lt;/em&gt;
platforms for testing and development and exploded into a plethora of subsystems and tools used for
deployment to end users.&lt;/p&gt;&lt;h2 id=&quot;hubs-for-end-users&quot;&gt;Hubs for end-users&lt;/h2&gt;&lt;p&gt;Thanks to new container-based systems designed for this purpose - such as
Flatpak, Snap, or AppImage - even ordinary users can now use programs that no longer
depend strictly on distribution package managers. These are the Linux equivalent of Windows
installers for end users. Installing new programs or sub-systems is now
simplified: no building from source, no backporting, no other more technical workflows. Just install
and update the latest product, kindly made available by multiple upstream teams
as a containerized image with only a minor runtime overhead.&lt;/p&gt;&lt;p&gt;At least on the surface, it seems like an ideal new world for Linux developers and users.
It probably is for advanced, self-aware users, but ...&lt;/p&gt;&lt;p&gt;&lt;em&gt;I felt a great disturbance in the Force, as if millions of voices suddenly cried out in terror and
were suddenly silenced. I fear something terrible has happened (Obi-Wan Kenobi).&lt;/em&gt;&lt;/p&gt;&lt;h2 id=&quot;the-splintering-of-the-linux-ecosystems&quot;&gt;The splintering of the Linux ecosystems&lt;/h2&gt;&lt;p&gt;The Linux ecosystem has always been extremely fragmented from the point of view of a Windows (or macOS) user.
There are too many distros, too many desktop environments, and too many programs that often do almost the same things,
but differently in some regards, always in the name of freedom of choice. However, all this is nothing
compared with what is coming. Distributions are going to become less and less relevant for running
applications in the cloud, and even on the user's desktop. For instance, in the case of Ubuntu and its many derivative distributions, even a big
part of the desktop system is now based on &lt;em&gt;snap&lt;/em&gt; and its Ubuntu-specific containerized hub.
Soon, all distributions could become very compact core systems, with most of the system applications moved
onto multiple external hubs with different update frequencies.&lt;/p&gt;&lt;h2 id=&quot;who-is-responsible-of-the-whole-supply-chain&quot;&gt;Who is responsible for the whole supply chain?&lt;/h2&gt;&lt;p&gt;At the moment, one of the main challenges for the security of applications - which have become more and more
complicated and dependent on &lt;em&gt;a plethora&lt;/em&gt; of different third-party software - is certifying the whole
supply chain. A &lt;em&gt;software bill of materials&lt;/em&gt; is nowadays required in multiple contexts, but guess what?
A splintering of the whole software stack management responsibility among multiple third-party hubs, development
teams, and end-users is a game changer.
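&lt;/p&gt;&lt;p&gt;Just to make the point concrete (a sketch of mine, not taken from any specific tool): a software bill of materials is, at its core, nothing more than a structured inventory of third-party components. A few lines of Python, assuming a hypothetical CycloneDX-style JSON document with made-up component entries, show the kind of inventory every team is now supposed to keep and check against security advisories at every image rebuild:&lt;/p&gt;&lt;pre&gt;&lt;code&gt;import json

# A hypothetical, minimal SBOM fragment in CycloneDX JSON style
# (illustration only): each third-party component is listed with
# its name, version, and package URL.
sbom = json.loads('''
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "requests", "version": "2.31.0",
     "purl": "pkg:pypi/requests@2.31.0"},
    {"type": "library", "name": "urllib3", "version": "2.0.7",
     "purl": "pkg:pypi/urllib3@2.0.7"}
  ]
}
''')

# The inventory one becomes responsible for: name/version pairs
# to be matched against CVE advisories when rebuilding the image.
inventory = [(c["name"], c["version"]) for c in sbom.get("components", [])]
print(inventory)
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;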
As a user or developer, you will be &lt;em&gt;directly&lt;/em&gt; responsible for updating all
your applications and keeping them stable. No longer your distribution, but you.
Most hubs do not have clear, well-established, and timely policies for security updates.
In most cases, it is a task left to every development team: re-collecting all the
required pieces - in a consistent way - and rebuilding containerized images when needed.&lt;/p&gt;&lt;p&gt;Sure, there are &lt;em&gt;continuous integration&lt;/em&gt; workflows available in some cases. Still, I don't know of many
application teams seriously interested in reacting to security reports in a timely manner: starting from
some obscure CVE, patching, updating multiple source trees, and even ensuring that nothing breaks,
through careful testing platforms. Grunt work done by distribution security teams until now.&lt;/p&gt;&lt;p&gt;Even if base container images were &lt;em&gt;bricks&lt;/em&gt;, carefully managed and updated, the final build and
deployment of application images is a different matter, and while the current approach could
help keep the most recent software installed on end users' desktops, I'm pretty sure
that most of the thousands of embedded devices that now populate our homes are in a pathetic
state, security-wise. Many of them have opaque, short-term support, and the use
of multi-hub sources can only make the whole thing worse: we are all potentially living with multiple
little time bombs permanently connected to the net.&lt;/p&gt;&lt;p&gt;This has been so since the very beginning in embedded environments; now we are changing things
in the same way for desktop and cloud applications.&lt;/p&gt;&lt;p&gt;&lt;em&gt;Once you start down the dark path, forever will it dominate your destiny (Yoda)&lt;/em&gt;&lt;/p&gt;&lt;p&gt;We live in interesting times...&lt;/p&gt;</content></entry></feed>