Steve Barnes' World of Happiness

A moment for Brad Cox.

I'm given to understand Dr Brad Cox died this month. I didn't know the name, but after learning he received the primary credit for creating Objective-C, I was instantly aware I knew much of his legacy. Cox developed the language in the 1980s and would later license it to NeXT, sending it on its journey toward its golden age as the main language used for macOS and iOS apps. When I sat down ten years ago to learn from scratch how to develop for Apple platforms (after a childhood with only a little AmigaBASIC, HyperTalk and HTML), Objective-C was the first language whose more sophisticated programming concepts I struggled to grasp.

I suppose the most recognizably "flagship" trait of Objective-C is object-orientedness, which in hindsight seems so much like a fundamentally right answer that two of the most commonly used web programming languages, JavaScript and PHP – neither of which was originally designed with this concept at front of mind – have since evolved their own object-oriented tendencies.

There are other traits special to Objective-C which I imagine are more consciously appreciated by people who learned to program before it appeared. In my own hindsight, its syntax was also one of the strangest compared to that of other languages (for example, sending an object a "message" in square brackets, such as [waiter takeOrder]), but for a few years, it was my first and only real impression of the way many such things were done.
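For anyone who never met the language, here's a minimal sketch of that message-sending style (the Waiter class is hypothetical, defined only to echo the example above):

#import <Foundation/Foundation.h>

// A hypothetical Waiter class, defined only to illustrate the syntax.
@interface Waiter : NSObject
- (void)takeOrder;
@end

@implementation Waiter
- (void)takeOrder {
    NSLog(@"Order taken.");
}
@end

int main(void) {
    @autoreleasepool {
        // Messages nest: alloc and init are themselves messages in square brackets.
        Waiter *waiter = [[Waiter alloc] init];
        // "Send the takeOrder message to waiter."
        [waiter takeOrder];
    }
    return 0;
}

Even the method call reads like a little sentence, which became part of the charm once the brackets stopped looking alien.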

Since I started out with that, I've loved learning about computer history through reading current manuals, riffling through remnants of surviving communiqués exchanged on the earlier web, and watching some of the videos churned up on YouTube, but it had never occurred to me to ask where this particular language came from. I'm sorry I missed you, Doctor Cox. That was quite a contribution.

WindowSwap.

Another place on the Internet that appeals to my sense of its proper use: "open a window" to watch a pre-recorded video from near a stranger's window.

The terms of service look official (despite being on Google Docs – I'm not sure why), but I'm guessing this is one person's work, and they seem to have gathered a number of high-quality submissions from diverse regions.

Flash: a legacy technology in both senses.

I knew this calendar date was significant somehow.

Some time ago, Adobe announced they'd end support for Flash, and today is the day. That means they won't issue further security updates, and they recommend users remove it from their systems. Flash's final hours.

In 2010, some seemed stunned to learn the newly announced iPad, a device Steve Jobs had emphatically framed as better for browsing the web than a smartphone or a notebook, wouldn't support Flash. Steve articulated his thoughts in one of his few open letters, memorably called "Thoughts on Flash". His case, in short, was that native web technologies would soon provide similar functionality that all devices could implement according to open standards and with better performance.

He didn't even have to mention some of the pain points: "downloading the Flash installer" had become a household verb phrase. A security issue meant warily awaiting a patch, and a bug could crash your browser. One malware-delivery method of the period was to disguise the malicious product as a Flash installer, relying on users' muscle memory to download and run it without thinking.

In any case, Jobs knew what he was talking about: Apple had long since involved itself in developing open web standards, and its now-ubiquitous WebKit engine was well into adolescence. Today, creating interactive animations on the web using HTML standards is relatively fun, easy, and natively and reliably supported in browsers everywhere.

But in the early 2000s, that was closer to unheard of. Perhaps you could do something tricky with JavaScript, but web pages were still largely text and images; anything richer, even embedding a typical video file, meant relying on a fair amount of explicit browser support or installing a third-party plug-in.

"Flash," near-synonymous with Macromedia before Adobe took it over, was a proprietary platform for rich, animatable, interactive stuff. Its coolness made its third-party-ness worth overlooking, and it became popular enough that it seemed reasonable to expect most would have it installed – or if they didn't, that they should. Occasionally, you'd find Flash was "required" because an entire site was created in it, using a single HTML page only as a wrapper.

A staple of the time was Homestar Runner (the page is still live as I type this – does it load for you?), the riffy cartoon family so enduring and beloved that I think they may still live off their merchandise sales. In addition to countless fully voiced animated sketches with easter eggs and interactive end cards, the site featured a complete spoof of an old Sierra game dubbed Peasant's Quest, and a three-part send-up of old text adventures called Thy Dungeonman. They weren't edge-pushing 3-D extravaganzas, but they were good, full games. I think they mentioned some time ago that they may convert things to HTML5, but Homestar has seemed so wedded to Flash that the two have felt more like a single life, and I'm going to miss all of it.

That's what Flash was before the native web could ever have been it. It may have inspired Apple and other standards groups to get their act together and work toward the better web-based "future" in which we find ourselves, and that's a legacy not to be sneezed at.

Chicory, by the way.

Apparently it's a root, I've recently discovered, not visually dissimilar to a carrot. What a deceptively humble appearance for something so distinctive when roasted, ground and dissolved in hot water.

My first hot coffee beverage was consumed well into adulthood, on a quiet night in the passenger seat of a comfortable vehicle. The flavour of coffee is lovely (in chocolate or ice cream, for example), but the intense bitterness as a brewed drink confused me; it seemed more like self-torture than indulgence, yet was somehow a worldwide staple. Not until I rose regularly to work on little sleep did I come to embrace it, and not until years later did I begin perceiving my body's threshold for caffeine, not to mention the drink's ability to stain enamel or dentine.

Food and drink alternatives are worth an entire chapter, as easily scoffed at as adopted depending on attitude. If I hadn't spent so much time with coffee, I might have appreciated chicory for itself, not as an alternative – and it wouldn't have "confused" me. The beverage has its own bitterness without cream or sugar, but not of the same overwhelming class. Its flavour is closer to the average of all tastes than coffee's is. Most importantly, its aftertaste is slightly sweet and flavourful, as if a trace of the "burnt-ness" of burnt sugar occurred in it naturally.

Barely caloric, not filling like hot chocolate – a comforting, unimposing beverage that quickly carved its own place for me. I think it deserves the same at large.

The syzygistic solstice.

I hope you'll join me in appreciating the rare natural significance of the solstice today, marked not only by the sun's extreme angle, but by the near-alignment of Saturn and Jupiter as seen from Earth – something that's happened only rarely since I formed on it (and I somehow missed it last time).

I use "seen" hypothetically, since clouds continue helpfully to blanket my own vantage – so I'm relying on my fellow humans to capture and share the moment, or at least avail their chance to view it for themselves.

Andrés Arrieta on Facebook's campaign.

Yesterday's post overviewed Facebook's campaign, with its bizarre angle framing the tracking of unwitting users as virtuous, but left unanswered the question of what Facebook's staff is really thinking; an appropriate omission, since I'm no expert on the personalized ad industry, though I'm enough of a beginner to assume it's about the money derived from it. The main available source for insight seems to be this article from the Electronic Frontier Foundation.

Does Facebook not even know how it sounds?

Speaking of digital privacy.

MacRumors reports Facebook has just posted its second full-page ad in The Wall Street Journal, The New York Times and The Washington Post, starting with:

Apple vs. the free internet

Apple plans to roll out a forced software update that will change the internet as we know it — for the worse.

What are they talking about?

Apple has long made per-app access to personal data subject to explicit user approval. For example, you have to tap "allow" in a dialogue box before a new app can access your photos. In iOS 14.4 (likely), this scheme extends to the practice of tracking users across apps and web sites owned by other companies – an activity that previously was permitted by default, with only an opt-out setting.
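For a sense of what that looks like from the developer's side, here's a minimal sketch of the permission request using Apple's AppTrackingTransparency framework (the framework and its API are real; the surrounding view controller is hypothetical, and a real app must also declare a usage-description string in its Info.plist):

#import <UIKit/UIKit.h>
#import <AppTrackingTransparency/AppTrackingTransparency.h>

// A hypothetical view controller that asks once its interface is visible.
@interface ExampleViewController : UIViewController
@end

@implementation ExampleViewController
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (@available(iOS 14, *)) {
        [ATTrackingManager requestTrackingAuthorizationWithCompletionHandler:^(ATTrackingManagerAuthorizationStatus status) {
            if (status == ATTrackingManagerAuthorizationStatusAuthorized) {
                // The user tapped "Allow"; tracking-related identifiers become available.
            } else {
                // Denied, restricted, or still undetermined: the answer is no.
            }
        }];
    }
}
@end

The important design point is the default: unless the user explicitly taps "Allow", the answer the app receives is no.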

The Internet's technical nature has allowed companies to accomplish and evolve user tracking to the point of nurturing an industry, often by exploiting web browser and networking features not designed for this purpose.

Because of this – notably – a company wishing to track a user hasn't typically sought their permission. Users often haven't even known this was happening, and companies typically haven't vocalized much beyond somewhat readable yet familiarly wordy and nebulous privacy policies. As this practice has grown, companies whose interests overlap with those of the user have been catching up, designing mitigating safeguards and measures with at least the expertise and creativity required for the tracking itself. To my memory, only then did user-tracking companies' voices begin to buzz more plainly, with a tone recognizable in these full-page ads.

And what was it this time, again? Apple versus "the free Internet"? For a dialogue box that only affects apps on the App Store while leaving the current web browsing experience untouched?

I think Facebook's argument is supposed to be this: Apple is, quote, "threatening" personalized ads; small businesses rely on personalized ads for revenue; therefore, let us band together against Apple for small businesses.

Apple isn't threatening personalized ads. They're explicitly – more explicitly than before, in fact – allowing users to grant permission to be tracked. The alarm-sounding translation happened entirely on Facebook's end, prompting questions that feel more closely coupled to the truth about whatever's going on here. What is it, Facebook? Do you expect users will say no? Why is that – are you not capable of explaining the benefits of personalized ads to them? You seem ready to explain to everyone but them – ready enough to spend whatever amount gets you full-page ads in high-profile newspapers.

I try to phrase things gracefully, but for a dominant company with a respectable public relations staff, this seems so pathetic. Not only does Facebook's argument fall apart like smoked meat from a brisket, but I'm not even sure they realize how they sound. They might as well say, "the only solution we can think of is to track people without asking."

This from a company whose legacy is now largely one of grovelling contrition for massive negligence or misuse of users' data; a company whose developer keynote last year saw Mark himself gesture at the large-typed "The Future Is Private."

Needless to say, it reads less like Facebook cares about small businesses and more like they were scrabbling for a heartstring-tugging angle in the style of Harold Hill. Facebook's main campaign page even features a few small business owners demonstrating the same subtle panic Facebook presumably hopes to provoke generally, smearing the owners themselves with the same tastelessness. Not "please, iPhone users, consider opting in – you'll get personalized ads and could help us get ahead without much effort on your part; we'd really appreciate it" – but "please, everyone else, we need to force Apple to let Facebook track users without their permission – yes, please, support us, the small business."

Resurfaced with this campaign were remarks titled "A Path Forward For Privacy And Online Advertising," in which Erin Egan, Facebook's Chief Privacy Officer, elaborated on their technical history and stance with another virtuous claim: "… we continue to believe personalized ads and privacy can coexist."

Well, that sounds precisely like Apple's goal here; but we already know Facebook isn't keen on that. So what does Egan offer?

That’s why we’re investing in research and development of privacy-enhancing technologies. These technologies will help us achieve the value of personalized ads while using and sharing data securely in de-identified form.

What strikes me isn't the wording, but that sense of pattern: companies who don't sincerely respect the user have been on the record before as "investing in," "researching," and so on – meanwhile, companies who do have already done the research, and have been busy doing things, hence the conversation.

Indeed, the thesis of the article isn't the presentation of any kind of specific alternative, but a (just as familiar) call for "debate":

Should any one company decide for us where the balancing of equities should land when considering the pro-consumer and pro-competitive benefits of personalized advertising?

Well, Apple isn't deciding; they're doing what they can to ensure individual users decide, and only on Apple's own platform. No one has to use an iPhone, and some people do use iPhones because they believe Apple will furnish exactly this choice and control.

Why doesn't Facebook make its own phone, then, offering users the choice to flock instead to its benevolent garden? Well, it did:

AT&T, the exclusive U.S. carrier of the First, only reportedly sold over 15,000 units of the device, while both ReadWrite and Time named it among the biggest failures in the technology industry for 2013.

And I suppose that brings us back to today. I love social media in principle, and its rise was a time of excitement for me. But the more I hear since leaving, the more content I am to have opted out of Facebook entirely. It was a lot to lose, personally and – now, as an independent artist and developer – perhaps financially, but I don't want the help of a company harbouring this attitude in its leadership, let alone wearing it right on its shirt.

To cleanse the palate in wrapping up, I'll link this often-linked quote from Steve Jobs:

Privacy means people know what they’re signing up for, in plain English, and repeatedly. That’s what it means. I’m an optimist, I believe people are smart. And some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data. That’s what we think.

Apple's not perfect (as Steve also conceded), but that paragraph still rings distilled and true as the words of timeless leaders. Mark Zuckerberg was right there in the audience, yet ten years on, even the notion of that respect seems absent from Facebook's own thinking. The company has advanced superficially but not fundamentally. I hope others make up their own minds, and whatever you conclude, call it to memory next campaign. And here's to an Internet in which you're only tracked if you want to be: dare I say, a freer one.

Craig Federighi on digital privacy.

More conferences are online this year, and the 10th Annual European Data Protection & Privacy Conference picked this American guest well.

To an Apple fan, much of this speech is repeat material (with the feeling of, "oh, Bernie Sanders thinks health care is a right? I wasn't sure"). Of course, health care is best regarded as a right, and Craig advocates that users' privacy is equally so. In this speech, he does it with Jobsian-sounding common sense and efficiency, dipping into technical concepts only when needed.

Among those concepts are a recollection of Safari's pioneering blocking of third-party cookies, the more recent advent of its Intelligent Tracking Prevention, and the assertion that App Store apps will soon be held to something similar – and could face expulsion for misbehaving. Craig also mentions that Apple has called for a GDPR-like digital privacy law in the United States, which I hadn't heard. (I can't say I think the dump truck's worth of "we use cookies" banners is a proud consequence, but I have the feeling Apple's proposition will reveal a better way.)

I write about this because I think it's important. It's great to enrich yourself by learning a little about encryption or whatever – but fostering an attitude of entitlement to privacy generally can influence software companies' choices, which could help shape the future like a gardener helps shape a plant. It seems analogous to Carl Sagan's feeling that a healthy democracy's citizens should understand the basic principles of science, such that the elite and government appointees aren't the only ones with a clear view on their own choices, their consequences, and their accountability.

Last decade, when the evolving web was at its least secure and most abusable, there seemed to be an attitude that privacy was eroding and that the mature response was resignation or adaptation. Apple really has "thought different" about this from the beginning, starting with Steve Jobs. I'm not sure I'd be typing this myself if they hadn't. If they could inspire me to adjust my outlook, I think most people today can help inspire each other.

HomePod mini: quick first impressions.

I was a first-day adopter of the original HomePod, which was strange, because even amid a life of composing music, I'd seldom considered myself an audiophile: the seeming target consumer for this product. What mattered with music, I thought, was whether you could hear and distinguish the notes and sounds. With speech, whether you could make out the words. Inexpensive speakers and headphones were fine, and those who disagreed were like connoisseurs of wine who could be fooled by a blindfold.

Perhaps it wasn't that I deeply cared, but that I still wondered whether I'd been missing something. I had fully passed on the HomePod's spiritual progenitor, the iPod Hi-Fi, which I gathered wasn't popular, though I had found stories of a handful who swore lasting affection for it.

Later I'd begin to appreciate finer aural distinctions with the Beats Solo Pro and the AirPods Pro. A separate topic, but what's relevant is that both managed to render the gentle bass of soft, ambient audio in a way that wasn't clumsy or obnoxious, but faithfully gentle; rather like a blanket or an unrealistically soft mattress. However they did it, I could tell it wasn't easy.

The original HomePod managed to do that to a room. Oddly, therein lay its awkwardness for me: its bass felt unpredictable in its capability – when watching Star Trek, for example, I feared a dramatic orchestral cue or brief action sequence would overwhelm me or disturb people in adjacent rooms. This was partially because of the HomePod's way of handling voices, which – even with the Apple TV's "reduce loud sounds" option always on – seemed to quieten and restrict them to more vocal-centric frequencies, perhaps to more finely delineate that contrast between aural "elements" for which it was marketed. The inability to influence its decisions was disheartening, and I learned to treat this as the proposition: if you want to be able to adjust your audio, don't buy a HomePod. It's purpose-specific. If you buy one, it's because you want to talk to Siri and play music only from your Apple devices, Apple Music, or a handful of other Internet sources.

The HomePod mini's restrictions are identical, but after coming to fear and revere the HomePod's prowess, I found myself hoping the Mini – with its single driver and unbelievably minuscule bearing, the only Apple product the size of an apple – would liberate me from it.

It turns out the audio is essentially the difference. I doubt I'd have noticed if I hadn't lived on a HomePod, but hearing Siri's voice from the Mini for the first time made me think "wow, that's tinny."

Again, by comparison. The "American female" voice dwells above the bass clef, but even it benefitted somehow from the original HomePod's subtle muscle.

It's clear the Mini did liberate me in the way I hoped. It can't produce the enveloping fundamental aural layer of a soft thunderstorm or a deep set of pads, but it makes the attempt, and the result of the attempt is about what I was hoping for. Instead of the unrealistically comfortable mattress, it's perhaps a premium body pillow. My non-audiophile life in no way diminished my appreciation for the expansiveness of simple stereo, and while the original HomePod felt like more than enough, I'm tempted to pair my Mini in the future.

This drives home for me that different speakers and different headphones just sound different, period. Audio engineers may wish this wasn't so, but it's a better lesson to embrace than fight.

Surprisingly, the display proved a noteworthy difference as well – more so, to my mind, than most reviewers I've read have suggested. Here's what I noticed:

Thank you, Cuthbert.

Many mammals, through their interactions with each other and with humans – their happiness, sadness, scrutiny, fear, excitement, and perhaps even humour – eventually had me guessing that all mammals feel more alike inside than they might seem.

Cuthbert, sadly no longer among the animals at the Caenhill farm, might be the first who did the same for avians. Just look at him during this rush hour. Lest you think that's a one-off, Cuthbert earned his reputation as the farm's viral mascot through equally striking regular appearances, though not always wearing his heart on his wing: I recall him deferring his arrival until after the first hectic moments, affording him the chance to strut at distinguished leisure with his two flanking companions, speaking whatever he supposed he was speaking.

Cuthbert never recovered from the fight that crippled him, but at least he had plenty of company and care, and Chris' dutiful report of his death was simple and touching.

The career technologists who founded and developed the Internet, a destined farmer, and this particular, irreplaceable goose. None of them could have been either of the others, and the insight and inspiration so widely shared could only have occurred with all three.

Perhaps the first time a goose has ever been thanked by a distant human he would never have known.

The eve of Apple’s “One More Thing” event.

We know what that means. After a long morning of exciting developments in hardware and software, Steve Jobs would mildly make as though to wrap up, before idling into: “but there is one more thing.” Not always, but sometimes he’d have saved the best announcement – the one that would resonate in the news and through the months and years – for last.

As happens, history has magnified the flavour of the memory. Having taken his place as CEO, Tim Cook dared not utter the phrase until the announcement of the Apple Watch, which five years on has sold over 30 million units. The cheering audience seemed to consider it an earned usage, as with its reutterance preceding the iPhone X in the first keynote in the Steve Jobs Theatre. “We have great respect for these words, and we don’t use them lightly,” Tim said.

I’d say this evening’s usage is a pretty heavy one. And it should be.

They haven’t said it, but it’s clear this event is about the Mac. What has been said this year: Apple is ready with macOS Big Sur, dubbed version 11 after two decades of nominally incremental upgrades to version 10. It’s a largely user-facing redesign which folds in ideas from iOS and feels like it might betray thoughts about future hardware.

Secondly, the announcement that it was time for a fundamental shift from imported Intel processors to Apple’s ARM-based processor designs, called “Apple Silicon,” which have sped the iPhone and iPad to impressive maturity.

I’m no low-level hardware expert, but I’m aware this sort of transition is immensely complex. A major operating system upgrade is complex enough, but in this case the whole operating system must be rewritten, or at least somehow recompiled or translated, not even to mention the countless apps available for it.

In its pattern, the transition resembles the Mac’s move to Intel in 2005: a robust developer transition kit with full emulation, a line of new Intel-based devices in the works, and an operating system which had been secretly compiled in parallel from its dawn. (That transition is, incidentally, where the word “Mac” made its way into every product that would run it; for example, “MacBook.”)

Why were Steve’s turns of phrase, like “one more thing,” so memorable? I think Steve allowed his sense of taste, everyday humanity, and common sense to undercut the technical concerns which had guided the computer industry. Design choices flowed less from questions like “what’s the logical next step from a programmer’s perspective,” and more from questions like “what would make sense?” or “what would be great?”

That’s the trust he gained with his keynote audiences, and with Apple’s customers. While hardware and software evolved, the company wouldn’t be afraid to step way further back to reconsider anything or everything. When Steve explained a product, you could expect it to make sense in a way much else didn’t. Indeed, he’d occasionally invite applause after a stride with a casual “make sense?” In this way, Apple – but initially, the Mac – became the expression of a certain soul or spirit. The Mac was often the computer of artists, of creators, of Robyn and Rand Miller, of Stephen Fry and Douglas Adams. Of me in later grade school, thanks to insightful teachers.

After these announcements tomorrow, with the adoption of a look influenced both by what Apple has learned from its recent adventures and by its history, and with the freedom to part with a longtime partner and grow further into itself, I hope to feel this spirit shining as brightly as ever. We’ll see.

Alex Trebek's final answer.

Alex found the way to a career he enjoyed enough to master and execute unflinchingly for 37 years.

Jeopardy! wasn't air or food or drink or health care. And even if it wasn't education, it would still have been a significant part of generations of lives in North America and beyond.

The format of the "game show" will undoubtedly resonate into the future on television and the Internet, and Trebek was one of its defining hosts. On and off the show, his plain expression and sophisticated vocal tone perfectly belied the wit, insight and heartfelt feeling found in his words.

It's strange to think something someone enjoys so selfishly can benefit so many others, and pondering Alex's example will, I think, make me personally more comfortable with that thought.