Objective-C was a far more elegant and powerful solution than C++ to the problem of extending C to be "object oriented".
The square brackets made it very clear when you were engaging with Smalltalk-style message-passing semantics. But the language was still a full superset of C, letting you call any legacy code you wanted, or use plain C syntax for performance-critical parts.
And for as much criticism as the language received, it was still the perfect fit for the original iPhone. Performance-critical parts could be written in C for the constrained hardware resources. But still allowed for rapid iteration on dynamic UIs for applications.
The hate Objective-C has received for its syntax is unreal. It’s a minimalistic, easy-to-navigate language, but people assumed Swift would be easier just because it looks more like Java and C#.
Meanwhile, Swift is such a mess that even its own creator said the following in an interview:
“Swift, the original idea was factor complexity (…) massively failed, in my opinion (…) Swift has turned into a gigantic, super complicated bag of special cases, special syntax, special stuff”
Well, arguably he started the whole thing, since Chris wanted it to be one language to rule them all, from assembly to JavaScript. That was the moment I knew Swift was wrong from birth.
Now, 10 years later, Apple is stuck with hundreds of engineers trying to improve the language and rewrite some of the APIs, and apps, in Swift, all with very little user benefit. This actually reminds me of the Apple-without-Steve-Jobs era.
Imagine if Apple had simply said "we are going to use Objective-C for another 10 years" on a wait-and-see approach. I think the decline in Apple's quality is simply that when Apple has too many resources, different teams all scramble for those resources and push something out for credit, in classic resume-driven development.
The ability to mix languages is really underrated I think. Today you can toss a ball of Objective-C/C, C++, and Swift at clang and it’ll figure it out and spit out a single binary. That’s kinda crazy, and it lets you use whichever makes the most sense for each component of your app and gives you the ability to leverage a staggering number of libraries between the four.
It’s a stark contrast to e.g. Android world where making use of languages/libraries outside of the JVM bubble is technically possible but not necessarily a good idea in many if not most situations due to the caveats involved.
That is by design; of course it doesn't work easily. People keep trying to fit a square peg into a round hole.
From the official NDK documentation:
"The NDK may not be appropriate for most novice Android programmers who need to use only Java code and framework APIs to develop their apps. However, the NDK can be useful for cases in which you need to do one or more of the following:
- Squeeze extra performance out of a device to achieve low latency or run computationally intensive applications, such as games or physics simulations.
- Reuse your own or other developers' C or C++ libraries.
"
Anyone who keeps refusing to understand that point of view naturally runs into walls that the Android team has no plans to ever change.
> But still allowed for rapid iteration on dynamic UIs for applications.
This is the aspect that IMO was most harmed by the transition to Swift - and then later to a much deeper extent by SwiftUI, which makes quickly refactoring UI code very painful.
It's a shame that Objective-C never really caught on outside of the NeXT/Apple ecosystem. User interfaces benefit greatly from dynamism, and all the UI toolkits I've used for C and C++ try to emulate message passing in one way or another: from WinAPI's SendMessage, to GTK and Qt's signals and slots.
I'm surprised the wider FOSS community didn't adopt the language. I've been building a GTK4 app recently, and the macro-heavy class boilerplate, C-style casting everywhere, and custom signaling mechanisms would all be far cleaner in Objective-C. It's easy to imagine glib and GTK as what could have been a FOSS parallel to Core Foundation and Cocoa.
If you like C, then Objective-C is definitely worth a look. You don't need a Mac to try it either [1].
I'm not sure. I've written so much GUI code over several decades, and I think dynamism is only slightly helpful. I've been writing a new GUI in TS (not at all ready for publicity yet) that aims to rethink GUIs from the ground up as if the 80s and 90s never happened, but with the benefit of hindsight, much like Go did with C. I've been meaning to do a proper write-up on some of the innovations I think are genuine improvements over the status quo. I should probably do one at a time and start today instead of waiting until release like I planned. But in my GUI, dynamism is only needed in maybe one or two core places. I'm not sure it makes any use of the fact that JS has string keys (equiv of objc_msgSend/etc), and it could probably be written in boringish C++ just fine, or maybe even boringish Go, although operator overloading would clean up one or two APIs really nicely.
Very cool that you can use the Xcode Interface Builder to build the UI.
I wonder why this type of style hasn't caught on with React and friends? It would be really nice to be able to have an AppKit-quality UI programmable in React or Svelte.
I know I know mobile blah blah. But lots of web apps are complicated enough to only be useful on a large screen, like Figma.
Indeed, the late-binding capabilities of ObjC make the data binding scheme used in Apple's Cocoa API so much easier. This is one of the things I miss the most when building GUI apps for other platforms. And you can still mix it with C++ (=> Objective-C++).
Definitely! I developed several mobile apps in the early iPhone days. I had plenty of C and C++ commercial experience before that. I found Objective C much more pleasant to work with than C++.
It is a great language to build on, but I think it was really competing against C++ and, unlike Qt, never had the corporate backing outside of Apple that many open-source frameworks have.
You're bringing back memories. I ran WindowMaker on my Sun desktop (Solaris 2.6, I think?), back in the late 90's. I spent days customizing that system, compiling everything from source.
It's a shame that C is the only native mainstream ABI-stable language to catch on broadly. ObjC or Swift would be nice for library developers on Windows/Linux, even if only for writing the entry points to their library.
And to some degree I echo the sentiment as well. While I was never in search of the divine programming language, I too felt that as Objective-C was being sunset and Swift was in ascendancy, perhaps it was my time to also step out of my career — sunset myself, so to speak.
Swift was something of a hard sell for me. It seem(s/ed) to borrow everything from every popular language allowing two different code bases to look as though they might have been written in two different languages (depending on the preferences/style of the two coders).
To be sure, a lot of the young engineers seem to have been drawn into the Apple ecosystem not because, like me, they grew up worshiping the user-interface brilliance of the Mac but because they are fans of the Swift language.
And like the author of the piece, I say, "Knock yourselves out, kids. Sayonara."
Having worked professionally with both C++ and Objective-C[0], I greatly prefer the latter. I'm not in love with either of them, but Objective-C feels so clean and well-thought out compared to the insanity of C++.
That's ok, C++23 is going to add another group of features that will be half-adopted at best in legacy codebases that will totally fix everything this time for real.
[0] in the same codebase via the unholy chimera that is Objective-C++
What a coincidence! I have been struggling to get Objective-C going on Windows for a few days now to test out some ideas I had for a LINQ-like language in C, and it has been an ordeal.
The only worthwhile runtime available (that doesn't depend on MinGW or some such) is libobjc2 from GNUstep. I decided not to use the full GNUstep Foundation, since it is clearly bloated and reflects a very Java-esque sensibility of the 90s, not to mention it depends on third-party libraries like libcurl and whatnot. However, it turns out that the root class NSObject is defined in Foundation itself, and you need a root class to get anywhere with the language.
Fine, I decided, I'll write my own lightweight root class. That turned out to be so much more than I bargained for. In the end, I have one that supports manual reference counting and ARC (GC would've meant dealing with Boehm, one problem at a time). https://gist.github.com/forksnd/264d80858ee98e6d44e89e8972c0...
However, it is clearly not done. I can't invoke an arbitrary method on an object through the Smalltalk syntax (I get a compilation error), and trying to do it through objc_msgSend fails silently. I was just trying to get method tracing working, but it seems to require pthreads (so Linux only, then?).
It's insane how difficult it is to get a minimal working workspace in this language. No, I don't want a huge framework; all I want is inline Smalltalk in C. No wonder this language never found any footing outside of Apple's walled garden.
It requires MSYS2 on Windows, which is a whole new userspace to deal with. In addition, the MSYS2 compilers output DWARF debug symbols (right?), which means none of the graphical debuggers (Visual Studio, RemedyBG, RADDbg, etc.) will work.
EDIT: Apparently, MSYS2's Clang has an option "-gcodeview" that can generate PDBs. I'll try it tomorrow and see how it goes.
> I decided to not use the full GNUstep Foundation
This is why it has been an ordeal. I came to a similar impasse, and it went away when I changed my mind. It's a little bloated, I'll give you that, but it's not that bad. Certainly better than bootstrapping 10+ years' worth of language features.
> Certainly better than bootstrapping 10+ years worth of language features
That's the thing, I think ignoring those library features and rethinking the role of message passing OOP in plain C can actually lead to a much better language. But I do need a root class.
In a kind of ironic way, my graduation thesis was to port a particle engine from Objective-C/OpenGL to Windows 9X/Visual C++/OpenGL.
At the time my supervisor wanted to save the research done in 3D visualisation techniques with the particle engine, developed on a NeXT Cube, and the Apple/NeXT acquisition was yet to happen.
The department was ramping down its use of NeXTSTEP, as it was clear the OpenSTEP efforts were not going to save the company either.
Thus several students got to rewrite applications from Objective-C into something else.
Had they known what would happen with NeXT's reverse acquisition of Apple, and OS X, most likely those thesis proposals would never have happened in the first place.
I agree. I had a couple of NeXT slabs from 1991 to 2000 or so. Objective C had some benefits. The NeXT Obj-C libraries were very usable and well thought out. I will grant that it had problems, the biggest of which was that it wasn't Windows 3.11 and it wasn't backed by Microsoft. The amount of pro-Microsoft press and propaganda was astonishing during that period.
A single NeXT workstation cost as much as a bunch of Windows 3.11 PCs. I know; my graduation thesis was porting software, originally written on a Cube, from Objective-C to C++.
Immaterial with respect to judgements of Objective-C as a programming language. The cost of one single NeXT workstation could have bought a bunch of eggs, flour, sugar and milk to make pies and cakes, too.
You joke, but the price of the whole package is what dictates buying decisions, not the greatness of Objective-C as a programming language.
StepStone failed in the market, the authors moved to NeXT, and it isn't as if NeXT was doing that great when Apple decided to acquire it.
Also, let's not forget that during the early OS X days, Apple was so unsure Objective-C would be taken up by the Mac OS developer community that they decided to ride the Java wave with their own implementation and the JavaBridge.
Only after they saw Objective-C being fully embraced by the developer community did they drop their efforts to make Java a first-party language for OS X development.
Personally I never liked having to type @ [] all over the place, even though I am a big Smalltalk fan, starting with Smalltalk/V for Windows 3.x.
And all the macros for basic types (YES, NO, BOOL, ...) always seemed a bit of a dirty way to achieve them, compared with the C++ approach of proper keywords.
Well, without Objective-C there would be no Java, nor C#, so there is that, as positive influence.
Yeah it's pretty clear that whoever wrote that article has never heard of Alan Kay and upon reading this comment would fruitlessly attempt to use Google or ChatGPT to figure out why he is relevant in this context.
(Seriously, if you feel the temptation to do that, don't waste your time. You won't get the nice quick answer you want. A better use of your time would be trying to translate 間 into English.)
Never used Objective-C or done any serious programming (I work as a DE, which mostly involves data modelling, so I don't consider myself a serious programmer), but I feel the same as the author.
The work has bogged down whatever interest I have in programming, and the only sane solution is to somehow magically remove all financial burdens, go into a cabin in a mountain, and program my own projects and read some science, preferably with a dog and a fire.
I don't know if this will be helpful or inspirational or depressing, but your comment made me think of Paul Lutus, who pulled off at least some parts of the dream of living in a cabin and writing software. Here's an interview he did with Adam Gordon Bell on Bell's CoRecursive podcast (transcript and audio):
Cold take: Objective-C was best appreciated by using Interface Builder back before iPhones, CoreAnimation, and autolayout. As Interface Builder became less useful and more painful, the dynamic nature of the language became a liability instead of an essential attribute.
Interface Builder never should’ve been merged into Xcode. It’s only gone downhill since it was. Not that it was perfect as a standalone tool but it was better, particularly for Mac development.
I had read a book about Objective C in the 1980s and towards the end, was in with the Sybase crowd, so I did see NeXTStep (which had adopted it) back in its prime and instantly realized the potential --NeXTStep let "business analysts" visually create graphical database apps using pre-manufactured components written in Objective C (more than this really, there is also the concept of Responders which are like in-line services that can be arranged to create reusable value).
The problem Steve had with that nice Objective-C system was that the fools who ran Corporate America were from the generation still "shell shocked" from all the vendor lock-in that went on during the wars between DEC, IBM, HP, Sperry/Burroughs (later Unisys) and smaller players like Nixdorf, none of which were software (or hardware) compatible with each other, meaning their customers were held hostage, often with incredibly expensive long-term contracts on less-than-state-of-the-art machines. In the new desktop era (which had just begun), for a short while at least, they wanted "cross-platform apps" that ran on universal hardware (think PC and Mac "clones"), and that meant using object-oriented frameworks. The only OOP systems mature enough to have platform-specific GUI libraries for DOS, Windows and Mac were some interesting Smalltalk packages and some exotic C-macro-based systems like Neuron Data's Open Interface Toolkit and eventually Microsoft MFC (which was available for Classic Mac and all the versions of Microsoft Windows). Of course, as Windows took over the game, the need for cross-platform apps ended --just as Visual Studio, Microsoft FoxPro/Access and Visual BASIC were cleaning up and really locking everyone in for the decade. There was no more need for object-oriented systems like NeXTSTEP or Smalltalk.
But then the WWW became a thing and NeXT made a bold move with WebObjects, which allowed their ObjectiveC visual tools (Project Builder) to output HTML in realtime. About the same time, Sun Microsystems launched Java with a really terrible UX library, but the promise of the portable Java interpreter (which was similar to the promise of UCSD p-code back in Steve's earlier Apple days) and that meant Java could "run (ugly looking but portable) code" in web browsers on any hardware. Oh happy days, a way to get back into Corporate America without the word Microsoft on your business cards. While the FIRST end of ObjectiveC is described in the article (before NeXT picked it up), Steve saw the potential to replace that ugly Java UX library with their WebObjects masterpiece and pivoted to rewrite WebObjects to output Java --and that was the SECOND end of ObjectiveC.
Somehow my Dad's old colleague working with Rear Admiral Grace Hopper on behalf of CalPERS was able to bail the shareholders of NeXT and Apple out. But NeXT was all about WebObjects by then and you couldn't run Macs with WebObjects (that would be a "thin client" which was a much maligned concept and offered no value-add for Apple), so that meant Avie Tevanian's team had to hold their nose and fuse the stinking classic Mac operating system into NeXTStep, breathing new life into ObjectiveC.
As that post-NeXT Mac operating system was "forked" to make battery powered phones, the idea of allowing developers to write apps in ObjectiveC became a liability. Suddenly they had a class of "fart apps" that were draining batteries, closing unexpectedly, heating up devices in people's hands, etc. --and to a casual user with limited computer experience, that looked like an iPhone problem. Apple had the incredible app-review process going on, but it's not the correct prescription for curing stupid programming. They needed a solution like Java (being used by their competitor and embroiled in litigation involving Sun/Oracle/Microsoft/Google) or even their old UCSD p-Code interpreter that ran Pascal, a language that had long been obsoleted by Ada, which everyone with half a brain hates. So..
> The end came for Objective-C in June of 2014, when Apple announced the debut of Swift
I wouldn't say that the THIRD death of Objective-C happened in 2014, nor is it upon us in 2025. Aside from a lot of existing code and performance reasons to use Objective-C, there is also the fact that the popular AI coder models were all trained on GitHub as of 2023, and the Swift code out there spans six different versions of the language, mostly written by people who were just learning it (so it isn't leveraging Swift's value prop much). Cross-platform Linux/Mac code, like Moonlight for iOS, is also written in Objective-C (which had superior C++ integration compared to Swift until very recently). It's possible to remove Objective-C from Apple's product line, but it can't be a priority given the industry move to agentic apps.
One of the coolest parts of Obj-C is that it's the only "dynamic" language that includes header files (AFAIK). This means you can use agentic coding tools way more effectively. Instead of having to shove the whole implementation into context, it can just work off the interfaces. Result is that even with just a few hundred K of context, the agent can reason about an entire large codebase. And because all interop between objects is based on interfaces and protocols, refactoring of implementations becomes trivial.
https://x.com/krzyzanowskim/status/1812238141496934738
[1] https://github.com/gnustep/libobjc2
It does look interesting, though it lacks most modern DX, which means its adoption is going to be limited, I imagine.
[0]: https://www.cappuccino.dev/learn/
I remember that the company behind it was called "280 South". They seem to have opensourced it before they shut down.
https://cs.gmu.edu/~sean/stuff/java-objc.html
They were often used together though.
There is only the OS ABI, and the ABI of C compilers tends to overlap with the OS ABI in the cases where the OS was written in C.
This is easily visible outside the UNIX ecosystem.
I certainly managed to use it for some test programs a number of years ago.
https://objfw.nil.im/home
https://github.com/ObjFW/ObjFW
This is deeply uninformed, with bald prejudice added.
https://en.wikipedia.org/wiki/Distributed_Objects_Everywhere
https://corecursive.com/remote-developer/
To give just a taste, here's a forum post that quotes a few highlights from that CoRecursive episode:
https://retrocomputingforum.com/t/remote-developer-1970s-app...