I'm K1R5TY5, or Kirsty if you prefer. I'm here to... well, read the 'About Me' page if you please. Thank you. Believe me, it's more interesting than my online shopping history.
In this blog, you'll discover how everything is part of a "ONE." Yes, with a capital "O." As in... "everything is connected." Sounds cliché, doesn't it? But sometimes the deepest truths are hiding in plain sight. And yes, that's life!
By the way, here comes the good stuff: the past and the future were written thousands of years ago. Yes, I know, sounds like a cheap novel. But still, people keep saying "History repeats itself"... what a coincidence!
Anyway, on this blog, you'll find lots of seemingly useless information, but if you pay attention, you'll discover how we're all part of "The Grand Design" and how it's time to wake up. Wake up... to what, exactly? Well, that's the million-dollar question.
That's something you'll have to figure out yourself. You see, here I'll be like your alarm clock, providing you with different melodies to help you wake up. After that, I won't know what to say. The truth is, nobody knows, and that's the fun part. Or is it?
Ok. TL;DR :D (About time!!!)
What you'll find here is what I found out there.
News: Some. Truth is, I won't cover technology news as such; I'll just talk about what I consider most important.
Analysis and Critiques: Of what? Mostly technology, both modern and ancient.
Music: It's not just music, or rather "noise in the form of music." That's right, I'm talking about the pinnacle of musical expression created by humankind: EDM (electronic dance music).
Reflections: What reflections? You'll see. Whenever I have an epiphany, you'll see it here.
"32-bit Linux is obsolete." Arnd Bergmann, a kernel developer, said so after enjoying a morning yoga session at the "Open Source Summit Europe 2025" (possibly; conflicting sources confirm this). In the presentation, titled "32-bit Linux in 2025 and beyond," he kicked things off with that very statement, which probably made some retro computer enthusiasts pretty upset. He also presented the current state of 32-bit platforms and the number of 32-bit devices added over time.
Arnd Bergmann explored the challenges, potential solutions, and the inevitable obsolescence of this paradigm in an ecosystem increasingly dominated by 64-bit architectures. While the presentation mainly focused on embedded and ARM devices, it's important to note that eradicating 32-bit from the kernel will also affect the desktop/server platform.
I don't think anyone is currently running 32-bit x86 on a server... unless they're running a Windows XP container to play "Microsoft Pinball." Even though 32-bit x86 Linux can address more memory via "Physical Address Extension" (PAE), topping out at 64 gigabytes is still a pretty frustrating limitation. Today, 64 gigabytes is practically nothing compared to systems offering 128 gigabytes, like the Lenovo "ThinkCentre." Let alone 32-bit software, which has become as rare as a dodo (yes, we're talking about the bird, of course!).
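By the way, if you ever wonder whether an old 32-bit CPU supports PAE at all, the flag is right there in /proc/cpuinfo on any Linux box. A quick check (nothing fancy, just grep):

    $ grep -qw pae /proc/cpuinfo && echo "PAE supported"    # the 'pae' CPU flag means up to 64GB of physical RAM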
However, the lack of 32-bit support will be a problem on the desktop. For example, Valve relies on 32-bit to keep existing on Linux. For starters, their client is only available for the "i386" architecture in Debian. Plus, a significant part of their business lies in selling "Windows 32-bit games," which often run perfectly well on Linux thanks to the compatibility layer called "Proton." If 32-bit compatibility were removed from the x86_64 kernel, Valve would have no choice but to "fork" the kernel and maintain it for their SteamOS operating system. Can you imagine what that would look like? Valve_Linux_x86: the kernel that lets you play Half-Life 1... forever; while waiting for the arrival of Half-Life 3!!! Awesome!
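For context, this is roughly what keeping Steam alive on a 64-bit Debian install looks like today, multiarch and all. Package names can vary between releases and repos, so take this as a sketch rather than gospel:

    $ sudo dpkg --add-architecture i386    # tell dpkg/apt to accept 32-bit packages
    $ sudo apt update
    $ sudo apt install steam-installer     # drags in a long tail of :i386 libraries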
Beyond that, it's worth mentioning that most Linux distributions have already abandoned, or plan to abandon, 32-bit. In the case of Debian, they no longer release installation images for "i386"; the last "i386" release was "Debian 12 Bookworm." Despite this, Debian still maintains the "i386" libraries... just in case someone needs to revive the past or create a YouTube video titled "Challenge: Living 30 days on 32-bit Linux."
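And if you're curious how much 32-bit is still hiding on your own Debian system, apt and dpkg will happily tell on you (assuming a stock setup):

    $ dpkg --print-foreign-architectures                 # prints 'i386' if 32-bit multiarch is enabled
    $ apt list --installed 2>/dev/null | grep ':i386'    # any 32-bit packages still lurking?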
It's clear that 32-bit is going away in Linux; whether it's due to vulnerabilities, limitations, or bugs, 32-bit will disappear from the kernel. Without 32-bit compatibility, our options for running 32-bit applications will be limited to containers, virtual machines, or, worst case scenario, emulation. Thinking about it, emulating Xbox 360 and PlayStation 3 is already a challenge... imagine what it will be like to emulate x86 in 20 years. Maybe by then, we'll be able to enjoy Half-Life 2 emulating x86 the same way we can enjoy Half-Life 2 emulating Xbox 360 or PlayStation 3 today.
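In fact, you can already get a taste of that container future today. Docker, for example, can run a 32-bit userland on top of a 64-bit kernel; this assumes Docker is installed and that the image publishes a linux/386 variant, which the official Debian images do:

    $ docker run --rm --platform linux/386 debian:bookworm dpkg --print-architecture    # prints 'i386'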
I was thinking about uploading some videos to my blog, so I started encoding some files with ffmpeg as a test. When checking the results after encoding, I noticed that hardware decoding in MPV was disabled.
I noticed this because my CPU utilization spiked to 40% just playing a video, when it's normally only 3-7% with hardware decoding. By default, in Debian, MPV decodes via software, which is nothing new to me. If you want MPV to use hardware decoding, you have to add a line to the mpv.conf file: "hwdec=vaapi".
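As a side note, if you want to confirm that VA-API itself is alive before blaming the player, vainfo (packaged as "vainfo" in Debian, if memory serves) lists what the driver can decode, and MPV accepts the option on the command line for a one-off test; "video.mkv" here is just a placeholder:

    $ vainfo                         # shows the VA-API driver and supported codec profiles
    $ mpv --hwdec=vaapi video.mkv    # one-off test without touching mpv.conf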
Although it was already configured, the upgrade from Debian 12 to Debian 13 introduced a new version of MPV that, for some reason, disabled hardware decoding. I investigated a little: pressing the "i" key in MPV brings up detailed information, and I saw something new appear on the screen called "context." It turns out "context" refers to the API that the player uses to display the video on screen. Apparently, the new version of MPV uses "Vulkan." My GPU supports Vulkan, which is why it seemed suspicious that it wasn't working.
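If you'd rather not squint at the on-screen stats, MPV can also list the rendering APIs and contexts it knows about straight from the terminal (this should work on any reasonably recent build):

    $ mpv --gpu-api=help        # lists auto, opengl, vulkan, d3d11...
    $ mpv --gpu-context=help    # lists the display contexts MPV can use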
The curious thing is that when I tested the 3 GPUs installed on my PC, only one couldn't use hardware decoding. So I started experimenting with the player until I managed to get my hardware to decode video again.
What I discovered is that, for some reason, my hardware is incompatible with the new version of MPV when using Vulkan. However, when I switched the API to OpenGL, everything went back to normal.
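You can reproduce the experiment from the command line before committing anything to the config file (again, "video.mkv" is a stand-in for whatever file you test with):

    $ mpv --gpu-api=vulkan --hwdec=vaapi video.mkv    # software decoding on my problem GPU
    $ mpv --gpu-api=opengl --hwdec=vaapi video.mkv    # hardware decoding works again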
It turns out I had to add another line to the ~/.config/mpv/mpv.conf file and configure MPV like this:
|----------------|
|#old_vaapi      |
|                |
|gpu-api=opengl  |
|hwdec=vaapi     |
|                |
|----------------|
This constant need to resolve issues after updates is frankly exhausting. While this is a small configuration error, nothing serious, having to fix something that wasn't broken is unacceptable.
Not to mention that, for some reason, some programs have stopped working after the update. And I'm talking about Debian, not Arch Linux.
This article is based purely on speculation and my personal opinion. I'm a future-predicting expert! Well, maybe not, but who cares, right?
Nvidia invested $5 billion USD in Intel, acquiring roughly 5% of the total shares and making it one of Intel's largest shareholders. This all became possible thanks to the illegal smuggling of RTX and Blackwell GPUs to China. Nvidia didn't know what to do with the extra money, so they decided to invest in Intel to put it to good use.
The official announcement states that Nvidia and Intel are collaborating to create AI and RTX graphics in SOCs. Translation: Nvidia is determined to get its chips into as many devices as possible, and Intel is desperate to find a way out of its current situation. In other words, they're planning a masterstroke to dominate the laptop market, sending the Arc line to /dev/null.
For data centers, the idea is to create custom chips for AI and integrate NVLink. Simply put, they want to make things go faster and prevent information from getting stuck. This technology will be groundbreaking and change the world of AI forever. Can you imagine asking ChatGPT to convert all your family photos into a Studio Ghibli style in just nanoseconds? Awesome!!!
The truth is, Intel is going through a… complicated time. Their CPUs can't compete with AMD's, their GPUs can't find their footing, and now Nvidia, one of the giants of the sector, has bought a chunk of their shares.
CPUs that can't compete with AMD and GPUs trying to find their way among giants… it's like trying to climb Mount Everest in flip-flops. And then, bam! Nvidia buys 5% of the shares. Now, Intel's future is so uncertain that we might as well expect them to become a retro computer manufacturing company and relaunch the legendary "8086" with Riva graphics.
Intel's graphics cards have never been the best on the market, let's be honest. But their commitment to open-source drivers has earned them a special place in the FOSS community. Something that even AMD hasn't been able to achieve.
But with Nvidia's arrival, the situation gets tense. The possibility of Intel's graphics being replaced by Nvidia's is like suddenly returning to the era of AT PCs and being forced to buy a separate graphics card just to have open-source drivers.
The golden age of "gaming" is a distant memory. Many won't remember, but back then, our money went a lot further than it does today. I'm talking about the seventh generation - the Xbox 360 and PlayStation 3 era.
Back then, technology was advancing so rapidly that, within just 5 years of a console's launch, you could match or surpass its hardware for roughly the same price as, or just a bit more than, the console itself. I vividly remember the day I bought my beloved, "new" AMD graphics card, the HD 5670. That card allowed me to enjoy everything the seventh generation of video games had to offer on a budget I could scarcely have imagined.
I remember paying just $150 USD - it was insane. A $150 graphics card, "new," that could match or surpass consoles? Impossible today. $150 in 2011 is barely the equivalent of $200 now, and what can you buy with that amount of money? Not much.
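If you want the back-of-the-napkin math (assuming an average US inflation rate of roughly 2.5% per year over those 14 years, which is in the right ballpark):

    $150 x 1.025^14 ≈ $150 x 1.41 ≈ $212

So call it "about $200 and change."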
Our options are very limited. We have Intel's Arc A750 with 8GB, Nvidia's RTX 3050 with 6GB, and what about AMD? Well, I haven't found anything. None of those options can match or surpass consoles. If we want to get value for our money, we have to buy used, which is essentially a "gamble." Or, we buy a console.
If you're wondering why the market is so broken, here are the reasons. First and foremost, it's our fault. Yes, it is. We haven't curbed our enthusiasm for the "PC Master Race," which has backfired on us. Another factor is that graphics cards do so much more now than they did in 2011. Today, in 2025, almost any GPU can handle parallel computing (GPGPU), video encoding, and even accelerated computing for artificial intelligence, things that were rare or outright impossible on consumer cards in 2011. These are the main drivers of GPU demand in the market.
Before, the PC gaming market was very small. In fact, at the time, the PC was declared dead as a platform, with mobile devices being the future. Don't believe me? You can find tons of articles declaring just that from back then.
Another reason is the rise of "PC esports." While esports already existed, that console-centric community gradually migrated to PC. Today, most major esports tournaments take place on PC as the primary platform, and the leading games are all PC titles: Counter-Strike, League of Legends, DOTA 2, Valorant, Marvel Rivals, etc.
The ability to encode video and improvements in internet bandwidth have also created new types of GPU buyers: "Professional Streamers" and "Content Creators." Many of these users have no idea what hardware they're buying; they simply purchase what's within their budget or buy "the best" they can afford. Believe it or not, this has also inflated the GPU market.
Let's not even mention that AMD, Intel, and Nvidia are dedicating their GPU production to their most important clients: AI data centers, essentially abandoning the "gaming" GPU consumer. Frankly, we can't compete in a market where Google, OpenAI, X (formerly Twitter), and Meta (formerly Facebook) have the purchasing power and take all the GPU production, leaving us, mere mortals, with barely adequate cards at ridiculously inflated prices.
For me, these are the reasons why the GPU market is completely broken. Overdemand and overconsumption have completely destroyed the landscape. We could also say that the greed of GPU manufacturers and the lack of competition contribute to this disaster, as they ask for more money and give us less performance.
This is one of the main reasons why people like me have stopped buying PC hardware; honestly, it's not worth continuing to feed the greed of AMD or Nvidia while they insult us with products at hyper-inflated prices. Not to mention that current games don't justify the investment in hardware.
In the next post, I'll discuss games and how they also offer significantly less value than the hardware they run on.