source code – Hackaday
https://hackaday.com — Fresh hacks every day

I Installed Gentoo So You Don’t Havtoo
https://hackaday.com/2024/11/04/i-installed-gentoo-so-you-dont-havtoo/
Mon, 04 Nov 2024 15:00:22 +0000

A popular expression in the Linux forums nowadays is noting that someone “uses Arch btw”, signifying that they have the technical chops to install and use Arch Linux, a distribution designed to be cutting edge but that also has a reputation of being for advanced users only. Whether this meme was originally posted seriously or was started as a joke at the expense of some of the more socially unaware Linux users is up for debate. Either way, while it is true that Arch can be harder to install and configure than something like Debian or Fedora, thanks to excellent documentation and modern (but optional) install tools it’s no longer that much harder to run than either of these popular distributions.

For my money, the true mark of a Linux power user is the ability to install and configure Gentoo Linux and use it as a daily driver or as a way to breathe life into aging hardware. Gentoo requires much more configuration than any mainline distribution outside of things like Linux From Scratch, and has been my own technical white whale for nearly two decades now. I was finally able to harpoon this beast recently and hope that my story inspires some to try Gentoo while, at the same time, saving others the hassle.

A Long Process, in More Ways Than One

My first experience with Gentoo was in college at Clemson University in the late ’00s. The computing department there offered an official dual-boot image for any university-supported laptop at the time thanks to major effort from the Clemson Linux User Group, although the image contained the much-more-user-friendly Ubuntu alongside Windows. CLUG was largely responsible for helping me realize that I had options outside of Windows, and eventually I moved completely away from it and began using my own Linux-only installation. Being involved in a Linux community for the first time had me excited to learn about Linux beyond the confines of Ubuntu, though, and I quickly became the type of person featured in this relevant XKCD. So I fired up an old Pentium 4 Dell desktop that I had and attempted my first Gentoo installation.

For the uninitiated, the main thing that separates Gentoo from most other distributions is that it is source-based, meaning that users generally must compile the source code for all the software they want to use on their own machines rather than installing pre-compiled binaries from a repository. So, for a Gentoo installation, everything from the bootloader to the kernel to the desktop to the browser needs to be compiled when it is installed. This can take an extraordinary amount of time, especially on underpowered machines, although the ability to customize compile options and optimize software for a specific computer lets users claim that time back when the software is actually used. At least, that’s the theory.

It didn’t work out too well for me and my Dell, though, largely because Dell of that era would put bargain-basement, obscure hardware in their budget computers, which made for a frustrating Linux experience even on the more user-friendly distributions due to a general lack of open-source drivers. I still hold a grudge against Dell for this practice in much the same way that I still refuse to use Nvidia graphics cards, but before I learned this lesson I spent weeks one summer in college with this Frankensteined computer, waiting days for kernels and desktop environments to compile only to find out that something critical was missing that broke my installations. I did get to a working desktop environment at one point, but made a mistake with it along the way and decided, based on my Debian experiences, that re-installing the operating system was the way to go rather than actually fixing the mistake I had made. I never got back to a working desktop after that and eventually gave up.

This experience didn’t drive me away from Gentoo completely, though. It was always at the back of my mind during any new Linux install I performed, especially on underpowered hardware that could have benefited from Gentoo’s customization. I tried again occasionally over the years, only to give up for similar reasons, but I finally decided I had gained enough knowledge from my decades as a Debian user to give it a proper go. A lot has changed in the intervening years; in the days of yore an aspiring Gentoo user had to truly start from the ground up, even going as far as needing to compile a compiler. These days only Gentoo developers take these fundamental steps, providing end users with a “Stage 3” tarball which contains the core needed to install the rest of Gentoo.

Bringing Out The Best of Old Hardware

And I do have a piece of aging hardware that could potentially benefit from a Gentoo installation. My mid-2012 MacBook Pro (actually featured in this article) is still a fairly capable machine, especially since I only really need a computer these days for light Internet browsing and writing riveting Hackaday articles. Apple long ago dropped macOS support for this machine, meaning that it’s no longer a good idea to run its native operating system. In my opinion, though, these older, pre-butterfly Macs are still excellent Linux machines aside from minor issues like finding the correct WiFi drivers. (It also can’t run Libreboot, but it’s worth noting that some Macs even older than mine can.) With all of that in mind I got to work compiling my first Linux kernel in years, hoping to save my old MacBook from the e-waste pile.

There’s a lot expected of a new Gentoo user even with modern amenities like the Stage 3 tarball (and even then, you have to pick a stage file from a list of around 50 options), and although the handbooks provided are fairly comprehensive they can be confusing or misleading in places. (It’s certainly recommended to read the whole installation guide first and even perform a trial installation in a virtual machine before trying it on real hardware.) In addition to compiling most software from source (although some popular packages like Firefox, LibreOffice, and even the kernel itself are available as precompiled binaries now), Gentoo requires the user to configure what are called USE flags for each package, which specify that package’s compile options. A global USE flag file is also maintained for things like building GNOME, Bluetooth, or even 32-bit support into every package, while package-specific USE flags are kept in separate files. For example, when compiling GIMP, users can choose which image formats they want their installation of GIMP to support. There’s a second layer of complexity here too, as certain dependencies can be “masked”, or forbidden from being installed by default, so the user will also need to understand why certain things are masked and manually unmask them if the risk is deemed acceptable.
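To make that concrete, here is a minimal sketch of how those files fit together. The paths are standard Portage locations, but the specific flags and the GIMP example below are illustrative choices, not recommendations:

```shell
# /etc/portage/make.conf -- global settings applied to every package
COMMON_FLAGS="-O2 -pipe -march=native"   # compiler optimization flags
MAKEOPTS="-j4"                           # parallel build jobs
USE="systemd bluetooth -gnome"           # global USE flags: build in systemd and
                                         # Bluetooth support, leave GNOME out

# /etc/portage/package.use/gimp -- per-package USE flags
# enable WebP support in GIMP only, without affecting other packages
media-gfx/gimp webp
```

With files like these in place, a plain `emerge media-gfx/gimp` would compile GIMP with WebP support while still respecting the global flags.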

One thing that Gentoo has pioneered in recent years is the use of what it calls distribution kernels. These are kernel configurations with sane defaults, meaning that they’ll probably work for most users on most systems on the first try. From there, users can begin tweaking the kernel for their use case once they have a working installation, but they no longer have to do that leg work during the installation process. Of course, in true Gentoo fashion, you can still go through the process of configuring the kernel manually during the install if you choose to.

Aside from compiling a kernel, Gentoo also requires the user to make other fundamental choices during the install process that most other major distributions don’t. Perhaps the biggest one is that the user has to choose an init system, the backbone of the operating system’s startup and service management. Most distributions decide for you, with larger ones like Debian, Fedora, and Arch going with systemd by default. Like anything in the Linux world, systemd is controversial for some, so there are alternatives, with OpenRC being the most widely accepted in the Gentoo world. I started out with OpenRC in my installations but found that a few pieces of software I use regularly don’t play well with it, so I started my build over and now use systemd. The user can also select between a number of different bootloaders, and I chose the tried-and-true Grub, seeing no compelling reason to change at the moment.

In addition, there’s no default desktop environment, so you’ll also need to choose between GNOME, KDE, XFCE, any other desktop environment, or among countless window managers. The choice to use X or Wayland is up to you as well. For what it’s worth, I can at least report that GNOME takes about three times as long to compile as the kernel itself does, so keep that in mind if you’re traveling this path after me.

It’s also possible you’ll need to install a number of drivers for hardware, some of which might be non-free and difficult to install in Gentoo while they might be included by default in distributions like Ubuntu. And, like everything else, they’ll need to be compiled and configured on your machine as well. For me specifically, Gentoo was missing the software to control the fans on my MacBook Pro, but this was pretty easy to install once I found it. There’s an additional headache here as well with the Broadcom Wi-Fi cards found in older Macs, which are notoriously difficult pieces of hardware to work with in the Linux world. I was eventually able to get Wi-Fi working on my MacBook Pro, but I also have an 11″ MacBook Air from the same era that has a marginally different wireless chipset that I still haven’t been able to get to work in Gentoo, giving me flashbacks to my experience with my old Dell circa 2007.

This level of granularity when building software, and an overall installation, is what gives Gentoo the possibility of highly optimized installations, as everything down to the kernel itself can be configured for the user’s exact use case. It’s also a rolling-release model similar to Arch, so in general the newest versions of software will be available as soon as possible, while a Debian user might have to wait a year or two for the next stable release.

A Few Drawbacks

It’s not all upside, though. For those without a lot of Gentoo experience (including myself) it’s possible to spend a day and a half compiling a kernel or desktop environment only to find out a critical feature wasn’t built, and then have to spend another day and a half compiling it again with the correct USE flags. Or to use the wrong stage file on the first try, or to realize OpenRC won’t work as an init system for a specific use case, or to have Grub inscrutably fail to find the installation. Also, don’t expect Gentoo to be faster out of the box than Debian or Fedora without a customization effort; for me, Gentoo was actually slower than Debian in my benchmarks until a few kernel and package re-compiles. With enough persistence and research, though, it’s possible to squeeze every bit of processing power out of a computer this way.

Personally, I’m not sure I’m willing to go through the amount of effort to migrate my workstations (and especially my servers) to Gentoo because of how much extra configuration is required for often marginal performance gains thanks to the power and performance capabilities of modern hardware. Debian Stable will likely remain my workhorse for the time being for those machines, and I wouldn’t recommend anyone install Gentoo who doesn’t want to get into the weeds with their OS. But as a Linux hobbyist there’s a lot to be said for using other distributions that are a little more difficult to use than Debian or even Arch, although I’d certainly recommend using a tool like Clonezilla to make backups of your installation from time to time so if you do make the same mistakes I made in college you can more easily restore your system. For me, though, I still plan to keep Gentoo on my MacBook Pro since it’s the machine that I tinker with the most in the same way that a classic car enthusiast wants to keep their vehicle on the road and running as well as it did when it was new. It also lets me end forum posts with a sardonic “I use Gentoo, btw” to flex on the Arch users, which might be the most important thing of all.

How A DOS Format Blunder Revealed Some Priceless Source Code
https://hackaday.com/2024/05/25/how-a-dos-format-blunder-revealed-some-priceless-source-code/
Sat, 25 May 2024 20:00:09 +0000

As those of us who worked in the consumer software world back when physical media was king can attest, when a master disc has been sent for duplication and distribution there is no turning back from whatever code is in the hands of thousands of users. Usually such worries were confined to bugs or inadvertently sending out pre-release software versions, but [Lance Ewing] is here with the story of how Sierra On-Line once inadvertently released most of the source code for their game engine.

If you have some of the 720k floppy disk versions of the 1988 game Space Quest II, the first disk in the set appears to have nothing out of the ordinary, but a closer look reveals that the free space on the disk reported by DOS is greater than its used space. Diving into the disk block contents with a hex editor reveals that many of the unused blocks in fact contain C code, and some further detective work allows the recovery of a not-quite-complete set of source files for the company’s AGI, or Adventure Game Interpreter. They had been left behind when the original master disk had been cleared by deleting the files rather than by formatting it afresh.
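The recovery technique is straightforward to sketch: scan a raw disk image for long runs of printable text sitting in supposedly unused blocks. This toy version is a sketch only — the 20-byte threshold and the simulated image are arbitrary, and a real FAT12 floppy would need its FAT consulted to know which clusters are truly unallocated:

```python
import re


def carve_text(image: bytes, min_run: int = 20) -> list[str]:
    """Return printable-ASCII runs of at least min_run bytes found in the image."""
    # Match runs of printable characters (space through '~') plus tabs/newlines;
    # deleted-but-not-overwritten source code shows up as exactly such runs.
    pattern = rb"[ -~\t\r\n]{%d,}" % min_run
    return [run.decode("ascii") for run in re.findall(pattern, image)]


if __name__ == "__main__":
    # Simulate a disk whose deleted files left C source behind in "free" blocks.
    fake_image = (
        b"\x00" * 512
        + b"int main(void) { return 0; } /* leftover engine code */"
        + b"\xff" * 512
    )
    for fragment in carve_text(fake_image):
        print(fragment)
```

Running the same idea over a real floppy image (and then reassembling the fragments into files) is essentially the detective work described above.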

In commercial terms this would have been something of a disaster for Sierra had it been discovered back in 1988, because the AGI engine was the cornerstone of their success. As it was, we’re told the code sat peacefully undetected until 2016, since when it has proved invaluable to those interested in computer game archaeology. Or did it? We’ll never know if a sharp-eyed competitor snagged it, and kept quiet.

Of course, these days, there are game engines that are open source. Some of them are very modern. Others… not so much.

Source Code to the 1999 FPS Game Descent 3 Released
https://hackaday.com/2024/04/17/source-code-to-the-1999-fps-game-descent-3-released/
Thu, 18 Apr 2024 02:00:37 +0000

On April 16th of this year, [Kevin Bentley] released the source code to the sci-fi FPS game Descent 3. Originally released in 1999 for Windows, the game has you control a flying ship which you have to guide through both indoor and outdoor environments, shooting at robots that have been infected with an alien virus as you try to save the solar system. It was later also ported to Mac OS and Linux, but was considered a commercial flop due to low sales.

As one of the original developers, [Kevin] explains that one of the goals of this code release is to give the game a second life by cleaning up the C++ code and using new APIs. Original proprietary audio and video libraries from Interplay were removed, which means that some work is required before one can build a fresh copy of the game from this code base. That said, the released code is at the latest 1.5 patch level, with Mac OS and Linux support. Even if the original Descent games weren’t your cup of tea, it’s still great to see games being preserved and updated like this.

Thanks to [Phil Ashby] for the tip.

Diff Tool Knows What You Mean
https://hackaday.com/2022/09/10/diff-tool-knows-what-you-mean/
Sat, 10 Sep 2022 08:00:00 +0000

We will admit to not being particularly artistic, but we do remember an art teacher telling us that sometimes it is better to draw what isn’t there instead of what’s there — a concept known as negative space. [Wilfred] makes a similar point when explaining his “fantastic diff” tool called, appropriately, difftastic. He points out that when comparing two programs, the goal isn’t so much to determine what changed, but rather what stayed the same. The more you can identify as the same, the less you have to show as a change.
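That inversion — first compute what stayed the same, then report everything else as the change — is the heart of any minimal diff. difftastic applies it to syntax trees, but the idea can be sketched on plain lines with a classic longest-common-subsequence table (a simplified illustration, not difftastic's actual algorithm):

```python
def lcs_table(a: list[str], b: list[str]) -> list[list[int]]:
    """t[i][j] = length of the longest common subsequence of a[i:] and b[j:]."""
    t = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) - 1, -1, -1):
        for j in range(len(b) - 1, -1, -1):
            if a[i] == b[j]:
                t[i][j] = t[i + 1][j + 1] + 1       # line is "what stayed the same"
            else:
                t[i][j] = max(t[i + 1][j], t[i][j + 1])
    return t


def diff(a: list[str], b: list[str]) -> list[str]:
    """Emit unchanged lines with '  ', deletions with '- ', insertions with '+ '."""
    t = lcs_table(a, b)
    i = j = 0
    out = []
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            out.append("  " + a[i]); i += 1; j += 1
        elif t[i + 1][j] >= t[i][j + 1]:
            out.append("- " + a[i]); i += 1
        else:
            out.append("+ " + b[j]); j += 1
    out += ["- " + line for line in a[i:]]   # leftovers in a were deleted
    out += ["+ " + line for line in b[j:]]   # leftovers in b were inserted
    return out
```

The more lines the LCS captures as unchanged, the fewer lines the diff has to show — exactly [Wilfred]’s point about negative space.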

The tool compares source code in a smart way, assisted by tree-sitter, which already has parsers for many different languages, at least good enough for this purpose. According to [Wilfred’s] post, the tool supports 44 different languages, ranging from bash and YAML, through Verilog and VHDL, to C++ and Rust, among others.

Of course, the tool by itself is worth taking note of. But the real gems in the write-up are things like tree-sitter and a lucid description of the algorithm (borrowed from autochrome) for working out the minimal set of changes.

The code is still under development and the output is not always as clear as he would like. Still, a pretty good tool and a great write-up on the development challenges.

Although Verilog and VHDL are a start, we really want diff for schematics. Oh, and PCB layouts, don’t forget those, either.

The Legend of Zelda: Decompiled
https://hackaday.com/2021/12/24/the-legend-of-zelda-decompiled/
Fri, 24 Dec 2021 15:00:00 +0000

Keeping source code to programs closed is something that is generally frowned upon here for plenty of reasons. Closed source code is less secure and less customizable, but unfortunately we won’t be able to convince everyone of the merits of open source code any time soon. On the other hand, it is possible to decompile some of those programs whose source remains behind locked doors in an attempt to better understand that code, and one of the more impressive examples of that of late is this project, which has fully decompiled The Ocarina of Time.

To get started with the code for this project, one simply needs to clone the Git repository and then use a certain set of software tools (depending on the user’s operating system) to compile the ROM from the source code. From there, though, the world is your rupee-filled jar. Like we’ve seen from other decompiled games, any number of enhancements to the original game can be made including increasing the frame rate, improving the graphics, or otherwise adding flourishes that wouldn’t otherwise be there.

The creators of this project do point out that this is still a work in progress, as only one of the 18 versions has been completed, but the fact that the source code they have been able to decompile builds a fully-working game when recompiled speaks to how far along it’s come. We’ve seen similar processes used for other games before that also help to illustrate how much improvement is possible when re-writing old games from their source code.

Thanks to [Lazarus] for the tip!

Spell Checking Your Programming from the Linux Command Line
https://hackaday.com/2021/05/29/spell-checking-your-programming-from-the-linux-command-line/
Sun, 30 May 2021 05:00:00 +0000

For most of us who didn’t do well in high school English class, spell checkers are a real game-changer. Sure, you can still swap a “to” and a “too,” but a spell checker will catch a lot of typos. But what about in your source code? You usually don’t spell check source code and even if you did, the rules are funny. After all, “my_proejct” is a perfectly fine variable name, but you probably meant “my_project.” That’s where a program called typos comes in. It aims to be a spell checker for source code that is fast enough and with a low enough false positive rate that you can run it against changed code and reject spelling problems.

Sure, if “my_proejct” is a one-time typo, the compiler or interpreter will probably catch it. But it won’t catch comments and it also won’t catch something you spell wrong consistently. For that you need something like typos.

You can include a custom dictionary and also per-language dictionaries. It is aware of camel case and snake case and knows to ignore hex codes. The only thing we saw it doesn’t handle well is C-language escapes.
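The identifier-aware part is easy to sketch. This toy checker — the tiny word list is a stand-in for illustration; typos itself ships curated dictionaries and is far faster — splits snake_case and CamelCase names into words, then flags tokens that are a near-miss of a known word:

```python
import re
from difflib import get_close_matches

# Hypothetical mini-dictionary standing in for a real word list.
WORDS = {"my", "project", "count", "total", "index"}


def split_identifier(name: str) -> list[str]:
    """Break snake_case and CamelCase identifiers into lowercase word tokens."""
    parts = []
    for chunk in name.split("_"):                       # snake_case boundaries
        parts += re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", chunk)  # CamelCase
    return [p.lower() for p in parts if p]


def check(name: str) -> list[tuple[str, str]]:
    """Return (token, suggestion) pairs for tokens that look like misspellings."""
    issues = []
    for tok in split_identifier(name):
        if tok not in WORDS and not tok.isdigit():
            close = get_close_matches(tok, WORDS, n=1, cutoff=0.8)
            if close:                                   # near a real word: flag it
                issues.append((tok, close[0]))
    return issues
```

Running `check("my_proejct")` flags the "proejct" token and suggests "project", while a consistently used made-up name with no close dictionary match sails through — the same trade-off that keeps the real tool’s false-positive rate low.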

There are apparently other checkers out there and we learned about them from this project’s comparison grid. There’s misspell, codespell, and scspell. This is the tool we didn’t know we needed, but probably do.

If you are writing bash scripts and want to check their correctness there is shellcheck, which sounds like spell check but has a whole different function. If you want to brush up on your spelling, you can always hack a Speak ‘n Spell.

Ask Hackaday: Is Windows XP Source Code Leak a Bad Thing?
https://hackaday.com/2020/09/25/ask-hackaday-is-windows-xp-source-code-leak-a-bad-thing/
Fri, 25 Sep 2020 17:01:23 +0000

News comes overnight that the Windows XP source code has been leaked. The Verge says they have “verified the material as legitimate” and that the leak also includes Windows Server 2003 and some DOS and CE code as well. The thing is, it has now been more than six years since Microsoft dropped support for XP, does it really matter if the source code is made public?

The Poison Pill

As Erin Pinheiro pointed out in her excellent article on the Nintendo IP leak earlier this year (perhaps the best Joe Kim artwork of the year on that one, by the way), legitimate developers can’t really make use of leaked code since it opens them up to potential litigation. Microsoft has a formidable legal machine that would surely go after misuse of the code from a leak like this. Erin mentions in her article that just looking at the code is the danger zone for competitors.

Even if other software companies did look at the source code and implement their own improvements without crossing the legal line, how much is there still to gain? Surely companies with this kind of motivation would have reverse engineered the secret sauce of the long dead OS by now, right?

Spy vs. Spy

The next thing that comes to mind are the security implications. At the time of writing, StatCounter pegs Windows XP at a 0.82% market share, which still represents a very large number of machines. Perhaps a better question to consider is what types of machines are still running it? I didn’t find any hard data to answer this question; however, there are dedicated machines like MRIs that don’t have easy upgrade paths and still use the OS, and there is an embedded version of XP that runs on point-of-sale systems, automated teller machines, set-top boxes, and other long-life hardware that is notorious for not being upgraded by its owners.

From both the whitehat and blackhat side, source code is a boon for chasing down vulnerabilities. Is there more to be gained by cracking the systems or submitting bug fixes? The OS is end of life, however Microsoft has shown that a big enough security threat still warrants a patch like they did with a remote desktop protocol vuln patch in May of 2019. I wonder if any of this code is still used in Windows 10, as that would make it a juicy tool for security researchers.

As for dangerous information in the leak, there have been some private keys found, like the NetMeeting root certificate. But it’s hard to say how much of a risk keys like this are due to the age of the software. You should stop using NetMeeting for high-security video conferencing if you haven’t already… it reached end of life thirteen years ago, so there’s nothing surprising there.

You Just Might Learn Something

I think the biggest news with a leak of code like this is the ability to learn from it. Why do people look at the source code of open source projects? Sure, you might be fixing a bug or adding a feature, but a lot of the time it’s to see how other coders are doing things. It’s the apprenticeship program of the digital age, and having source code of long-dead projects both preserves how things were done for later research and lets the curious superstars of tomorrow hone their skills at the shoulder of the masters.

Like a Museum Vouching for the Legitimacy of Artifacts

Why don’t companies get out in front of this and publish end-of-life code as open source? This would vouch for the validity of the code. As it stands, how do you verify leaked code acquired from the more dimly lit corners of the Internet? Publishing the official source code for end-of-life projects preserves the history, something the Internet age has never given much thought to, but should. We’ve heard the company promoting the message that Microsoft loves open source; here’s another great chance to show it by releasing the source code, since it’s already out there from this leak. It would be a great step to do so now, and an even better one to take before leaks happen with future end-of-life products.

This is a pie-in-the-sky idea that we often trot out when we encounter stories of IoT companies that go out of business and brick their hardware on their way out. In those cases, the source code would allow users to roll their own back-end services that no longer exist, but Microsoft would be likely to frown on a “LibreWinXP” project based on their own code. It’s likely that the company still has a few long-term contracts to provide support for entities using XP hardware.

So What Do You Think?

This is Ask Hackaday so we want to know your take on this. When old source code leaks, is it a bad thing? Are there any compelling reasons for keeping the source code from projects that have seen their last sunset a secret? And now that the XP code is out there somewhere, what do you think may come for it? Weigh in below!
