I've thought about this too, but I can't even tell what the right default would be. At low load, events seem nicer, but at high load, polling seems necessary?
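One common compromise (the sketch below is my own illustration in Linux epoll terms, not anything from the article) is a hybrid loop: poll with a zero timeout while events keep arriving, then fall back to a blocking wait once the queue has been quiet for a while. The threshold of 1000 empty polls is an arbitrary placeholder:

    // Hybrid event loop sketch (illustration only).
    #include <sys/epoll.h>

    void run(int epfd)
    {
        epoll_event ev;
        int idle = 0;

        for (;;) {
            // timeout 0  => busy-poll (good under high load);
            // timeout -1 => block until an event arrives (good under low load)
            int n = epoll_wait(epfd, &ev, 1, idle < 1000 ? 0 : -1);
            if (n > 0) {
                idle = 0;
                // handle(ev);  // hypothetical event handler
            } else {
                ++idle;  // enough consecutive empty polls => start blocking
            }
        }
    }

That way the default adapts: under load it behaves like polling, and when idle it behaves like events.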
Q: The pictures of this case on the CoolerMaster site look like there's a USB-C trigger to a coaxial power connector, then another connector to the mainboard itself. Would this case work with a 20v coaxial power supply?
Q: What voltage/wattage does the Framework SFF computer need to power up?
A lot of USB-PD power supplies list only wattage, not the voltages they provide. Your 60w power supply is probably either 12v/5A or 20v/3A, but the page itself doesn't say: https://frame.work/products/power-adapter
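For reference, watts are just volts times amps, so a wattage-only label leaves the voltage profile ambiguous:

    20V x 3A = 60W
    12V x 5A = 60W

Either (or both) could be behind a "60w" rating; only the spec sheet or the label on the brick itself will say.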
I have some 15w USB-C 5v/3A power supplies (which won't charge my Sony camera), a USB-PD battery that supposedly puts out 12v on its one USB-C port, and an Apple 30w USB-C AC adapter. My 2016 MacBook charges on 5v, which is convenient. Framework forum posts indicate the laptops will charge on 5v, 12v, and 20v, but there was a problem with 15v.
Out of curiosity, is that a hand-crafted page? It has a nice flow, with the animations, the graphic call-outs, etc. I'm personally not interested in the Obj-C code snippets, but the overall layout of the page, the sidebar ToC, and the minimalist design are just really stunning.
Also - how did you do the visuals in the "Gnarly Bits" section that split the page/components out into verticals? That is such an amazing way to display the internals of a thing like a page.
If these elements could be packaged into a blog theme for whatever blog hosting platforms are popular these days, I bet you'd get a bunch of people to purchase it. Nice work!
It's hand-crafted with TailwindCSS, vanilla HTML, and JS. I worked on it for two months part-time. I was hoping it would get people's attention, and so it did. :)
The component split is done by hand in Figma with regular screenshots and cropping. I also used a plugin for skewing.
I think it's still annoying. When reading, I often look at where the scrollbar is to get an idea of how much reading is left. Right now, the scrollbar is so thin that I have to waste time looking for it. On a 34" monitor, it's very annoying.
Be real: Apple is not going to rewrite macOS/iOS in Swift. Objective-C will always be there, offering faster and more robust features.
Just look at the Microsoft equivalent: yes, C# is good and all, but the hardcore Windows apps are still using (lightly skinned) VC++ APIs, almost 25 years after they started flogging .NET.
Swift is for the new rubes, bootcamp graduates and so on.
They're literally rewriting in Swift right now. Foundation is being rewritten entirely in Swift. All new code is in Swift. All new frameworks are Swift-only. They're using Swift everywhere from low-level firmware on the Secure Enclave to apps. This is already real.
> That's a multi-year project in its very early stages, yet we're already almost 10 years into Swift (more than 10 years of Swift internally at Apple).
It has already shipped, replacing parts of Foundation in the 2023 OS versions. It continues to grow, and it's a rewrite, so it certainly proves your assertion wrong.
My other points were a bit hyperbolic. Feel free to replace "all" with "the vast majority of". Apple obviously still writes Obj-C in their existing Obj-C frameworks, and doesn't arbitrarily rewrite into Swift, but their internal barriers to using Swift are now almost entirely gone. And I can't think of a recent, entirely new framework that wasn't Swift-only.
> Your point about Foundation was meaningless and the others just nits. Do you have an actual point to make?
My point, as always, is the truth. You said two false things, which you subsequently admitted were hyperbole. Truth is valuable in itself, and more important than "points", i.e., arguments or motives.
If I were to make a point, though, it's that Objective-C still has a very long life ahead of it, and its complete replacement, if that ever occurs, will be an arduous process, given the amount of extant Objective-C code in the operating systems and first-party apps (not to mention third-party apps). It's not just Objective-C either: C++ is also used quite a bit in the OS. Think of WebKit, for example.
You're not going to be able to hire many people with Objective-C experience nowadays. Engineers with 7 years of experience writing just iOS apps will very likely have used only Swift in their work. I work with two of them now.
The article author is a solo indie dev. I'm a solo indie dev. We don't need to hire.
By the way, we could be hired, for the right compensation. Nonetheless, companies almost never try to recruit me, but they still whine about how "hard" it is to find ObjC developers. They're not even looking.
Besides, experienced engineers can learn a new programming language. Do you think that every engineer Apple hired before 2014 had Objective-C experience?
This is turning into a silly argument… but anyway, there's a blogger who has been tracking the number of binaries written in the various languages (and AppKit vs. Catalyst vs. SwiftUI, etc.) for years.
Sonoma is 13% Swift (up from 11% in Ventura) and 53% Obj-C (down from 55% in Ventura). The priority actually appears to be eating away at the C/C++ parts of the codebase (currently 33%, down from 42% just two releases ago).
At this point you can't separate Swift from the rest of the system so cleanly. Since it's now included with the OS directly and linked to from many system libraries, including parts of Foundation which have been directly rewritten in Swift while maintaining ABI compatibility with Obj-C callers, virtually everything on the system that uses Apple's frameworks uses Swift to some degree.
Disclaimer: I work for Microsoft, though I wasn't there during the early days of .NET or Windows Longhorn.
C# was created as a Java competitor. Although it had great C interoperability, the underlying .NET Framework was still a VM-based runtime with a garbage collector and all the disadvantages that brings. You can probably find various articles (https://longhorn.ms/the-reset/ is one) discussing attempts to adopt C#/.NET code for Windows Longhorn, which ultimately had to be walked back completely. .NET wasn't purpose-built for writing OS components or working deep inside existing Windows code.
Apple learned from this and other examples. The Swift team actively works with teams at Apple deep in native code to make sure they can handle their use cases without performance penalties, and with minimal ergonomic issues.
The difference is really about what the stated goals of the language were/are.
Thanks for sharing. My guess: in the year 2000, it was impossible with the desktop computing power of the day to use C# for OS internals. In 2024, it is a different story.
That's what MS said as well, when they were pushing C#. All of Windows will be using safe code! Still waiting... Another example is Mozilla and Rust - hell, I wouldn't be surprised if there was still Netscape code somewhere in the bowels of FF/TB!
Sure, Apple cares less about backward compatibility, but still, it's unlikely Objective-C is going anywhere, under the hood.
One of the most complex apps that Microsoft produces is Visual Studio. It is currently a hybrid of C++ and C#. I suppose that almost all new features are written in C# where possible. Why won't Apple follow the same path? The developer productivity in Swift must be 10x compared to Objective-C. To be clear: I write this post as someone who has infinite love for optimization of native code. However, in many situations, it is simply more "dev efficient" to write code in a managed (VM) language. Thoughts?
Apple does use Swift in their IDE, Xcode. Several years ago they rewrote the text editor component in Swift. It's taken them a while to get back all the features the old one had, and it has had a fair number of bugs as well. I often wonder why they didn't just leave it in ObjC and add the new features they wanted, like the minimap or sticky declaration headers.
I definitely wouldn't call Swift a 10x improvement in efficiency, and I like coding in Swift. I do Advent of Code in it each year, but I spend a fair amount of time just fighting with the compiler; after all these years, it still emits strange or flat-out incorrect diagnostics.
I concur. I'm 10k LOC deep into a SwiftUI app (absolutely no clue what that works out to in equivalent Objective-C + UIKit code), and one of the most frustrating things (after all the stuff you can't do without a PhD in Apple internals) is how piss-poor Swift errors are. I've changed a line of code in one file, and then another, completely unrelated one stopped compiling. Most frequently, it's something about how checking the file took too long and it should be broken up (which, you will learn, really means you have a small error somewhere in said file and Swift isn't in a sharing mood).
When I switch between C or C++ and a (non-deterministic) VM-based language like Java or C#, it feels like 10x. The IDEs are way more advanced, including (for me) the #1 all-important feature: debugging. For me (a mere mortal, average programmer), the fact that null pointer exceptions are clear in Java/C# is a huge gain compared to a "core dump" in C or C++. Going further, I am sure many would claim the same kind of productivity speed-up going from C or C++ to Python or Ruby.
I have a bit of code I use for ARM Cortex devices where I can trap bus errors. Most of the time I can recover the program counter where the error happened, and use addr2line to get the file and line number. I've heard game developers talk about doing that sort of thing as well.
I would think that if the C/C++ developers didn't have their heads up their **[1], that could be a standard, out-of-the-box feature. There isn't any reason a program couldn't spit to stderr, 'seg_fault: file boots.c, line 1043'.
In C++, I'm dubious that you couldn't throw an exception instead of dumping core.
[1] They got rid of frame pointers because they were sure that would make their dog-slow C++ compilers run faster. Voice-over: it didn't make them faster. It made programs impossible to profile.
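For what it's worth, that idea is implementable today on glibc/Linux; here's a minimal sketch (my own illustration, nothing from the thread). Getting file and line inside the handler would mean parsing debug info there, and throwing a C++ exception out of a signal handler is undefined behavior, so the realistic version prints the raw frame addresses and lets you resolve them offline with addr2line:

    // Sketch: catch SIGSEGV, dump a backtrace to stderr, symbolize later.
    // Build with: g++ -g -fno-omit-frame-pointer crash.cc
    #include <csignal>
    #include <cstdlib>
    #include <execinfo.h>   // backtrace, backtrace_symbols_fd (glibc)
    #include <unistd.h>

    static void on_segv(int)
    {
        void *frames[32];
        int n = backtrace(frames, 32);  // stack walk; frame pointers help here
        const char msg[] = "seg_fault: backtrace follows\n";
        (void)write(STDERR_FILENO, msg, sizeof msg - 1);
        backtrace_symbols_fd(frames, n, STDERR_FILENO);  // avoids malloc in the handler
        _exit(EXIT_FAILURE);
    }

    int main()
    {
        std::signal(SIGSEGV, on_segv);
        int *volatile p = nullptr;  // volatile so the deref isn't optimized away
        return *p;                  // deliberate fault to exercise the handler
    }

Running "addr2line -e ./a.out" on each printed address then gives exactly the 'file boots.c, line 1043' style output described above (given debug info in the binary).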
Sure, we should do both in the same sense that we should floss twice a day and we should stop kicking children. One of them is more urgent and makes a bigger difference, so efforts should be focused on that one.
That seems like a good example because I can't see how resources are being kept away from "stop child kicking" in order to promote flossing. Doing both is entirely reasonable.
The resource that is in conflict is political attention. The more politicians and voters focus on EVs, the less is done to improve walkability and public transit in our cities. EVs distract us from the real enemy: traffic.
Traffic is bad, but fossil traffic is strictly worse than EV traffic. So again, it seems possible to advocate both less traffic and that the remaining traffic should be better.
Perfect is the enemy of good. Even the Bay Area cannot get together and make meaningful improvements to non-car commuting infrastructure. I recall how many years it took just to get the dang bus lane from the Richmond district to downtown. I will take any incremental positive changes I can get.
We can do both, of course. But making a sufficiently global change to our ecological dystopian future involves some sociologically hard problems:
Human population, the sociology of capitalism, industrialised consumerism, the unscalability of communities problem (both established structural debt and newer dysfunctional debt) and the consumerist voting fallacy ("I will vote with my money").
If there were fewer people, if there were less industrial-scale consumerism, if communities were scalable, if consumers were in fact globally engaged as purchasers and voting citizens...
I have heard an environmental scientist say that the order of responsible choices from worst to best is:
= buy an ICE vehicle
= buy a zero-emission EV
= DO NOT buy a vehicle
Of all the difficult sociological challenges that could certainly bring change, it seems the likeliest to actually change are human population (fewer people) and zero-emission vehicles (more of them).
For some reason it makes me think of the "Very Hungry Caterpillar" children's book - not because it's about a worm, but because it has little holes cut out of the pages that you can stick your fingers through :P