File systems
May. 6th, 2026 06:09 am
For an external USB 5GB+ spinning disk (not SSD), is HFS a better choice than APFS? Assume no weird edge cases like spanning volumes or RAID are involved. It's just a disk.
It is very easy to find either answer, but hard to find one that sounds like it's from someone who knows what they are talking about, and isn't just cargo-culting it or reading from a press release. So show your work.
“Traditionally, fonts were just shapes.”
May. 6th, 2026 04:32 am
Should you ever be worried that displaying just one glyph could take almost 2 seconds and slow down your website by as much? Naw, of course not. This wasn’t a problem already in the 1980s and, in the lord’s year 2026, computers are pretty good at rendering a letter or a symbol at a moment’s notice.
Ha. I was just messing with you. Of course you should always be worried about fonts. All the time. Typography is beautiful, but fonts are brutal. They will constantly put you to the test, they will find ways to get out of alignment faster than a Zastava Yugo, and they will teach you about corner cases in places you didn’t even realize had edges.
Fonts will break your heart like it’s the month before the prom again, and again, and again.
Or, in Allen Pike’s case, break a heart somewhat literally. Pike wrote a nice quick story of the complexity of what needed to happen to show the heart emoji, and how under a very specific set of conditions – a certain browser, a certain emoji font, a certain emoji within that font – this led to an extreme slowdown.
What’s really interesting is that in order to fix it, Apple can either improve Safari or the font itself, and at the moment of writing, it wasn’t clear which was the right thing to do. (Oh, yeah. Fonts don’t just have bugs. Fonts have many kinds of bugs.)
Another interesting in-between-the-lines thing is that Apple’s emoji are perhaps the only survivor of the original skeuomorphic pre-iOS 7 era. Even today, the emoji party like 2008 never ended – still glossy, still textured, still bitmapped. I’m curious whether somewhere deep inside Apple, there exist exploratory designs for flat, vector versions of emoji that never saw the light of day.
It’s five answers to five questions. Here we go…
1. My boss punished me for an HR investigation on her way out the door
A little over a year ago, I started in a new workplace. Things seemed great at first — much less stress and a more regular schedule than my previous job, great coworkers, and when I had a significant health scare requiring multiple surgeries (I’m fine now) shortly after starting, my manager was really supportive. As the honeymoon period waned, however, it became clear that there were a lot of serious boundary issues with our manager — lots of “we’re a family” style issues. Inappropriate, boundary-crossing things were being said, things that made a lot of jaws hit the floor when recounted. Long story short is that I ended up reaching out to HR, with the support and knowledge of most of my peer-level coworkers. The hope from me had been she would get coaching around professionalism (like not asking invasive personal/medical/sexual questions of employees during staff meetings).
There was an investigation, and my manager sort of spiraled. She revoked several privileges (like flexible work) suddenly (for most people, but notably not for everyone). And she would lash out emotionally about perceived slights, and made at least one person cry. Based on the way she channeled her aggression, it seemed like she was working through the people she suspected of reporting her.
Fast forward a few months, and she announced that she was leaving. I was already scheduled to take an approved vacation during her last week in the office. When I returned, she was gone and she had submitted my annual review in my absence, which included rating me as “approaching expectations” (as opposed to meeting) across multiple categories, saying that my “interpersonal conflicts are a distraction to [me] and the team” and that I don’t take constructive criticism well. This was about a week ago.
I think she received some kind of confirmation that I reported her, and I am pissed. I feel like I have no recourse because she is gone. If she was still here I would ask, in good faith, for examples, because I try to be open to the possibility that there is room for improvement. But I have never had an “interpersonal conflict” with anyone at work except for my decision to report to HR, and I cannot think of a single instance of criticism she provided, constructive or otherwise!
Do you think there’s anywhere to go with this? I feel like this was retaliatory, but she doesn’t work here anymore. And I worry that bringing it up with upper management will just be held against me. Do I just need to breathe deeply, move on, and try to start fresh with a new manager when/if they ever hire someone?
Go back to HR and say this: “I’m concerned that Linda’s annual review of me was intentionally retaliatory because of my report about her to you. She had seemed very upset ever since the investigation, began revoking various privileges for people, and lashed out at multiple team members. The review is so out of sync with the feedback she’s given me previously — and some of it is objectively incorrect — that I’m concerned it was retaliation for my report and the subsequent investigation. I’m not sure how to handle this since she’s now gone, but I’m concerned about having this in my personnel file when it’s false.”
Related:
my boss retaliated against me in my performance evaluation after I talked to H.R.
2. My manager keeps firing people without any warning
My job employs a lot of part-timers, mostly younger people with little to no previous work experience. I’m one of several supervisors. Our main job is to support the part-timers, but our manager regularly asks for our input on things like hiring, policy changes, training, etc.
My manager is normally very good, and I’ve described her as the best boss I’ve ever had many times. She’s great at keeping multiple plates spinning, training new people effectively, project management, and giving good feedback. Unfortunately, the late-2024 federal funding cuts have hit us hard and compounded with other problems to result in my department running on a skeleton crew for months now. My manager has gotten noticeably more snappish, impatient, and overworked as a result. I’m full-time and grateful to be employed at all, especially since I’ve been looking for new jobs with no interviews for about a year, so I’ve been grinning, bearing it, and repeating, “That’s what the money’s for” to myself when she occasionally treats me somewhat unfairly out of stress.
However, she’s fired multiple part-timers over email with no warning since January. I think it’s unfair, arbitrary, and unnecessary. All of the people who were fired had attendance issues that are fireable offenses, but there are other workers with worse attendance who haven’t been fired because they’ve been here longer and/or my manager feels bad for them. I do too, but my manager has had months of in-person and email conversations with one employee warning her that she needs to hit a minimum amount of shifts with no improvement. The people who were fired got, at most, a vague hint over email that we needed them to shore up their attendance. There was never a face-to-face conversation with our manager making it clear that their jobs were on the line if they kept skipping shifts.
Do you have any ideas for ways I could pump the brakes on this fire-by-email trend, keeping in mind I have no hard power here? And should I start trying to warn employees with shaky attendance that our manager might fire them with little to no warning? On one hand, I want to keep out of the line of fire and just get my work done without making my boss think I’m trying to undermine her. On the other hand, I think our casual office culture has lulled some part-timers into a false sense of security, and these are undergrads without much work experience who might not realize that skipping shifts or even entire weeks of work is a lot more serious than skipping class. On a third hand, I’m busy enough as it is and about to get busier, so I don’t really want to throw yet another responsibility into the mix.
Talk to your manager! It shouldn’t take a huge amount of capital if you approach it as wanting what’s best for the organization, rather than taking issue with her judgment. Frame it as, “I know we’ve had to fire a bunch of people for attendance issues lately, and I think part of the problem is that we have so many people without much work experience who don’t yet understand what a big deal it is. Could we more explicitly warn people when their attendance is an issue? It might let us solve the issues without ultimately having to fire them, which would help lower the strain from the turnover.”
But also, yes — as a supervisor you should definitely be talking to employees about attendance expectations, even if your manager isn’t. You know she has specific attendance expectations (as most jobs would!), whether or not she’s going to talk to them about it — so if you see people running afoul of those, you should name it and let them know it’s a problem. You don’t need to say, “Jane might fire you with little to no warning”; you can say, “Reliably showing up when you’re scheduled is a requirement for keeping your job, and it’s something we do fire people over.” As a supervisor, you have the standing — and, I’d argue, the obligation — to have those conversations.
Related:
should you warn an employee before firing her?
3. I’m continually passed over for the higher-level responsibilities we discussed when I was hired
I have been in my role as office manager and EA to the CEO for six years. Prior to taking this role, I was second-in-charge at my workplace, and functionally in a COO role. I took a step down when accepting my current role as it’s a more interesting industry and allowed better flexibility.
When taking the role, the CEO and COO talked about training me into the COO role, particularly as she was planning on taking long service leave. However, every time I have asked to learn parts of her role, it’s been pushed back or ignored (e.g., “oh yes, maybe,” then nothing).
This week I asked if I would be covering her role while she is on long service leave and was told that another team member would be doing it. The CEO seemed surprised that I was interested in doing it. I have definitely made it clear in all my reviews that I’m interested in getting back into a more executive role.
I consistently receive positive feedback on my work from the CEO and COO. I regularly ask if there is anything I need to improve, and am always told they are very happy. I’m not sure what to do now. I like where I work, but it seems like I will not be given the chance to improve my career.
You need to ask her about it directly: “When I was hired, you and Jane talked about training me into the COO role since I was doing that role in my previous job. Is that still something you’re open to and, if so, what kind of timeline do you envision for that happening?”
Since it’s been six years with no movement on it, it’s possible that she doesn’t even remember those conversations. If that’s the case, just saying in your review that you’re interested in moving back in that direction won’t necessarily solve it; it will be more effective to very clearly lay out what the original discussion was and ask if it’s still on the table.
It’s possible that it’s not, for all sorts of reasons (anything from they’ve pigeonholed you into the job you’re now in to their thinking on who they’d want in that role having changed in the years since the original discussion). But if that’s the case, you need to find out so you can decide if you want to stay under those circumstances or if you’d be better off looking outside the organization.
4. Glassdoor is making you link your account with Indeed
Remember how we were so annoyed a while back when Glassdoor started making you add your real contact information to keep your account? Apparently now they have been bought by Indeed, and they are forcing you to connect your accounts. I didn’t even have an Indeed account, and it wouldn’t allow me to log into Glassdoor until I made one. You then have to search through settings to opt out of letting company “job posters” on Indeed have access to your Glassdoor account information! It’s opt OUT!
Clearly some boneheaded exec either has it in for Glassdoor as a concept or really does not understand the point of it. I’m going to have to delete my account and make a new one under a fake name now. Why do they have to make everything terrible??
What the actual F. Anonymity is essential for Glassdoor to work, so this is a terrible and nonsensical policy that drains Glassdoor of most of its utility.
5. Can I ask for a start date two months away?
I work in an industry where giving a month’s notice is expected from managers. After years of working in a very intense job, I’m considering a move to greener pastures, but I’m wondering how to negotiate the latest start date possible. If possible, I’d love to have a month off between jobs to truly rest, recharge, and see my extended family. Doing so would mean employers waiting two months for my start date. Is that possible, and how do I ask without sounding as burnt out as I feel?
In a lot of jobs, you can ask for a start date two months out. Some will have the flexibility to agree to that and some won’t, but it’s a thing people ask for, particularly with more senior-level jobs. You’d simply say, “I’m expected to give my employer a month’s notice, and I’m hoping to take some time off to recharge before starting with you. I can be flexible if needed, but would a start date of X work on your end?”
Related:
how do I negotiate my start date at a new job?
The post my boss punished me for an HR investigation, manager keeps firing people without any warning, and more appeared first on Ask a Manager.
Choir
May. 6th, 2026 04:01 am
also maggie animated a fun scene for instagram you gotta click through to see
but mostly: pledge kickstarter etc
Apple Cuts More Mac Studio and Mac Mini RAM Options as Memory Shortage Worsens
May. 6th, 2026 12:36 am
Juli Clover, MacRumors:
Apple has removed more desktop Macs from its online store as the global memory shortage continues. Mac mini models with 32GB and 64GB of RAM are no longer available for purchase, nor is the M3 Ultra Mac Studio with 256GB RAM.
The M3 Ultra Mac Studio is now available only in a 96GB RAM configuration, with higher-tier options eliminated. Both M3 Mac Studio and M4 Max Mac Studio models have delivery estimates of 9 to 10 weeks.
Chance Miller, 9to5Mac:
Last March, Apple was hit with a class action lawsuit after delaying the launch of the “more personalized Siri” that was first announced at WWDC 2024. Apple agreed to settle the case in December, and the full settlement terms are now available. Apple is set to pay $250 million to settle the lawsuit, equating to an estimated $25 per device. That number could reach up to $95 per device, depending on how many users submit claims. [...]
As part of the settlement, Apple is not admitting any wrongdoing. The company continues to assert that “it acted in good faith and in a manner reasonably believed to be in accordance with all applicable rules, regulations, and laws.” In a statement to 9to5Mac, an Apple spokesperson said:
Since the launch of Apple Intelligence, we have introduced dozens of features across many languages that are integrated across Apple’s platforms, relevant to what users do every day, and built with privacy protections at every step. These include Visual Intelligence, Live Translation, Writing Tools, Genmoji, Clean Up and many more.
Apple has reached a settlement to resolve claims related to the availability of two additional features. We resolved this matter to stay focused on doing what we do best, delivering the most innovative products and services to our users.
A $25/device settlement sounds about right. Apple ran ads showing features that still haven’t shipped. That they honestly intended to somehow ship those features, as promised, doesn’t mean the ads didn’t wind up being false.
“Who thinks about a screwdriver?”
May. 5th, 2026 11:39 pm
I found this 9-minute video from Rex Krueger about screwdriver handle design really interesting in the context of my post about Photoshop’s dialogs.
Screwdriver handles evolved over the decades in response to user needs and usage patterns, with a few clever affordances: some for everyone, some for specific use cases that might not be obvious.
I think by now all the basic onscreen UI elements – input fields, pop-up menus, checkboxes, buttons, top menus, sliders, and so on – have similar richness, as do all the core input devices like a keyboard, a mouse, a trackpad, or a touch screen.
That doesn’t mean that everything is set in stone, that no changes are possible, and that stuff that fell out of favour can never be taken away – after all, computer usage, input devices, and conventions are evolving much faster than screws at this point – but that one has to be aware of the history so that the changes are intentional, not accidental.
A few select comments from under the video that I found interesting:
The Craftsman handles are also different colors for Phillips and slotted screwdrivers.
The fluted handle was patented. So anyone else wanting to make a screwdriver would have to pay the patent holder. So they tried alternatives to make more money. That is the real reason until the patent expired. Plus if they invented a “better” way and held the patent, others would have to pay THEM.
The Swedish word for screwdriver is “skruvmejsel,” which literally translates as “screw chisel.”
The Big Idea: Martha Conway
May. 5th, 2026 09:17 pm
Do we as a society tend to abide by the phrase, “if you love something, let it go,” or are we more likely to dig our claws in and refuse to part ways? Author Martha Conway discusses in the Big Idea for her newest novel, We Meet Apart, just how impactful the absence of family members and loved ones can be, and what it feels like to be left behind.
MARTHA CONWAY:
When I was twenty-three, three of my five older sisters divorced themselves from our family. They took care to tell me that their issues were with my parents, not me, but nevertheless, I didn’t see or hear from them in over ten years. They didn’t attend my wedding, which hurt me deeply—it seemed to me that their non-relationship with my parents was more important to them than a relationship with me.
My feelings back then were tumultuous. I missed my sisters, I was angry, I was confused, and I was sad—often, it felt like, simultaneously. Later, when my mother died quite suddenly, I felt the same way: an avalanche of mixed emotions.
What do you do when a loved one leaves, or dies? Would you follow them if you could, even if it meant giving up your own independence, your own future? And how do you honor all the many emotions you feel without drowning in them?
In my speculative historical novel We Meet Apart, two American sisters find themselves stranded in Ireland in 1940, but in two separate worlds. They believe their whole family has died. One sister, Gaby, is devastated with grief but lives a comfortable life; her younger sister Sabine is angry and must fight to survive in a war-torn country. When they finally meet—for only an hour a day, at dusk, in that thin veil between two worlds—they must decide whether to stay together or part, probably forever. Staying together is familiar and comfortable, but it doesn’t allow for their personal growth. Parting means growth, separation, and possibly danger.
As I was writing this novel I found myself wondering: can a person give up a loved one voluntarily? And what are the consequences? What are the consequences of hanging on?
The older I get, the more often I hear a similar story to my own from friends and acquaintances: they have a family member who is “off stage” or “out of the family” or “not speaking to the rest of us.” The shame I once felt around my own broken family has lessened, knowing that others have had this experience, too.
Today I have a good relationship with two of these sisters, but it took time. Partway through writing We Meet Apart, when it became clear to me that one sister was going to go her own way, I felt a kind of acceptance. Children grow up, families change, siblings relocate, and the nuclear family shifts into another form. Sometimes, when it happens suddenly and without warning, it feels more impactful. But it always happens, to one degree or another. As the saying goes, the only constant in life is change.
We Meet Apart: Amazon|Barnes & Noble|Bookshop
Author’s Socials: Website|Facebook|Instagram|Substack
The Pentagon Pegs the Cost of the Iran War, So Far, at $25 Billion
May. 5th, 2026 09:55 pm
Taegan Goddard, quoting the Financial Times last week:
The Pentagon said President Trump’s Iran war has cost the United States at least $25 billion, driven primarily by the military’s use of munitions, the Financial Times reports.
The New York Times had an interesting piece trying to put that number in context (gift link):
$25 billion is similar to:
- The annual budget of NASA.
- Spending on military aid to Israel after Oct. 7.
- Spending by U.S.A.I.D. before it was disbanded.
- The cost to expand Obamacare subsidies for one year.
These are all comparisons to other aspects of the U.S. federal budget. It’s interesting also to use this in comparison with the current moment in tech:
- OpenAI’s latest valuation of $852 billion (I love the 2) equals 34 Iran Wars.
- Anthropic’s $380 billion valuation equals 15 Iran Wars.
- Apple’s current four-year U.S. manufacturing commitment of “more than $500 billion” equals 20 Iran Wars.
- Google expects to spend between $91 billion and $93 billion in capital expenditures this calendar year, mostly related to AI infrastructure. That’s a little over 3 Iran Wars this year.
- Larry Ellison currently has a net worth of $220 billion. That’s just short of 9 Iran Wars. But since the start of the war on February 28, his net worth has grown $46 billion. That’s about 2 Iran Wars during the time of the actual Iran War thus far.
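The comparisons above are just each figure divided by the Pentagon’s $25 billion number. For anyone who wants to double-check them, here’s a quick back-of-the-envelope sketch in Python, using the figures as quoted (in billions of dollars):

```python
# Each "Iran Wars" comparison is a dollar figure (in billions)
# divided by the Pentagon's reported $25 billion cost so far.
IRAN_WAR = 25

figures = {
    "OpenAI valuation": 852,
    "Anthropic valuation": 380,
    "Apple U.S. manufacturing commitment": 500,
    "Google capex (midpoint of $91-93B)": 92,
    "Larry Ellison net worth": 220,
    "Ellison net-worth gain since Feb. 28": 46,
}

for name, billions in figures.items():
    print(f"{name}: {billions / IRAN_WAR:.1f} Iran Wars")
```

OpenAI’s $852 billion works out to 34.1 Iran Wars, and Ellison’s $46 billion gain during the war itself comes to about 1.8 of them, matching the rounded numbers in the list.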
★ Software as the Product of Obsession Times Voice
May. 5th, 2026 09:01 pm
Back in 2009, Merlin Mann and I jointly gave a talk at SxSW titled “Obsession Times Voice”. Regarding how it turned out, I wrote:
My muse for the session was this quote from Walt Disney: “We don’t make movies to make money; we make money to make more movies.” To me, that’s it. That’s the thing.
Merlin and I were talking about independent writers and podcasters, because that’s what we were (and remain), but the concept applies just as perfectly to independent developers. This came to my mind after reading (and linking to) David Smith’s description of the new Pedometer++ today. Not just what it does, but why he spent six years making it. That’s the sort of productive obsession that fascinates me.
Ice water is always refreshing, but it tastes better when you’re on a road trip to hell. It feels like the world of software is bifurcating quality-wise. This whole thing about Adobe’s new craptacular “modern” UI language (a.k.a. “Spectrum”) exemplifies one side of that bifurcation — the bad-and-getting-worse side. Software that is the product not just of an ignorance of long-established principles of interaction design, but of a willful disdain for those principles. What Adobe is now shipping is just inexplicably bad UI, ignoring literally decades of great work and long-mastered concepts — a lot of which work was pioneered by Adobe itself!
The whole thing with MacOS 26 Tahoe is similar. To be clear, the UI crimes in Tahoe are deeply worrisome, but they are nowhere near as severe as those in Adobe’s Spectrum. But the problems with Tahoe are steps down the same fork in the road that Adobe took years ago. Spectrum is where Tahoe suggests that MacOS was headed under Alan Dye’s leadership: cross-platform sameness for the sake of sameness, with a complete disregard for longstanding platform nuances and idioms. In Spectrum’s case those platforms are MacOS and Windows and the web. In Tahoe’s case it’s MacOS and iOS.1
The other side of the software fork is not deserted. It’s just populated, more than ever, by the products of small independent developers who obsess, first and foremost, over quality and artistic vision. Remarkable new software gems exhibiting spectacular UI design appear all the time. They’re just not coming from the biggest companies, the ones whose apps, alas, dominate not just our desktops and pockets but our entire culture today.2
There’s always been software with poorly designed user interfaces. Much of it has been successful financially, sometimes spectacularly so. I’d argue, in all seriousness, that that’s the story of Microsoft in a nutshell. What’s new today is poorly designed software from developers from whom we expect better. In the old days there were people who would argue that prioritizing good user interface design was a waste of time — like spending hours decorating cupcakes destined for kindergarteners who are simply going to mash them into their mouths. (Again: cf. Microsoft’s undeniable market success.) What’s new today is people holding up objectively bad interaction design and proclaiming it to be good, and the product of teams that purportedly prioritize “design”, when it’s clear they have no idea what they’re talking about. It’s one thing to make something poorly designed and shrug on the grounds that it doesn’t matter. It’s another thing to make something poorly designed and hold it up as good design.
We are justified in expecting nothing short of insane greatness from Apple, and solidly good design from Adobe. In principle, all software ought to have well-designed user interfaces. That’s never going to be the case. But software for designers — Adobe’s raison d’être — absolutely demands to be well-designed itself, like how a book on writing must itself be well-written.
Perhaps I was wrong, though, to describe Adobe’s new UI as inexplicable. It’s just indefensible. The explanation for so much software going so rotten from a UI-design perspective is, the more I think about it, related to Nilay Patel’s “Software Brain” theory, which I’ve commented on both directly and indirectly. Here’s Patel’s definition of “software brain”:
The simplest definition I’ve come up with is that it’s when you see the whole world as a series of databases that can be controlled with the structured language of software code. Like I said, this is a powerful way of seeing things. So much of our lives run through databases, and a bunch of important companies have been built around maintaining those databases and providing access to them.
Zillow is a database of houses. Uber is a database of cars and riders. YouTube is a database of videos. The Verge’s website is a database of stories. You can go on and on and on. Once you start seeing the world as a bunch of databases, it’s a small jump to feeling like you can control everything if you can just control the data.
But that doesn’t always work.
You might think it counterintuitive that a movement obsessed with software would be spearheading a severe decline in the design quality of software, but in Patel’s definition, there’s no concept of software as art, as a practice, as a craft. Software brain is purely an obsession with software as a medium in and of itself. A means with no consideration for the end.
Framed in Walt Disney’s adage, software brain makes software only to make more money. The idea of making money in order to make more software — to afford the time and talent to craft it — does not compute. Framed in the metaphor that Steve Jobs used to close his introduction of the original iPad, and returned to again to close his final keynote at WWDC 2011, software brain is nowhere near the intersection of technology and the liberal arts. Software brain is so far down Technology Street that it’s no longer in the same zip code as Liberal Arts Avenue. Another way, perhaps, to define software brain is that it’s the utter rejection of Jobs’s maxim that “technology is not enough”. With software brain, technology is all there is.
-
I don’t want to belabor the similarities between Adobe’s Spectrum UI system and Apple’s Liquid Glass, because there are significant differences. Foremost, what’s wrong with Spectrum is wrong everywhere. Photoshop with Adobe’s new “modern” UI is, I suspect, just as bad a Windows app as it is a Mac app. Whereas the usability problems with Liquid Glass are lopsided platform-wise. It’s a litany of disasters on MacOS 26 Tahoe, but actually pretty good on Apple’s other version 26 OSes, especially iOS. There are aspects of Liquid Glass on iOS 26 that some people don’t like, but they’re literally skin-deep. Cosmetic details. Functionally, iOS 26 is pretty strong, and Apple made some very nice changes regarding the placement of things like search fields to improve consistency system-wide. I still have iOS 18 running on my year-old iPhone 16 Pro, and there are very few things I prefer in iOS 18 versus iOS 26. Whereas I’d be sick if I had to work in MacOS 26 Tahoe every day.
That’s my point here. iOS 26 doesn’t suffer in any way — not even one teensy little single way — from MacOS UI idioms being inappropriately applied to the iPhone. On the iPad, maybe there’s a little of that, like, say, the weird way iPadOS 26 uses Mac-style red / yellow / green window control buttons but makes them too small to use, so before you use them, you need a gesture to embiggen them temporarily first. But the implementation of “Liquid Glass” on MacOS Tahoe is just riddled with iOS-isms that aren’t appropriate on MacOS. So many decades-old Mac UI nuances and idioms were just ignored. They weren’t changed, they weren’t updated, they were just ignored. You either see that this is true or you don’t, and if you don’t see it, you shouldn’t be designing the Mac user interface. ↩︎︎
-
Consider the age of television. Television is the broadcast of motion pictures with sound. Cinema is an artform. But at the peak of television’s hegemony over western culture and mass media, the artistic quality of almost everything on TV was terrible. It was slop. It wallowed in its own sloppiness. This, despite the fact that cinematic artists had largely mastered the artform in the decades preceding TV. TV became popular in the 1950s and culturally dominant in the 1960s. But Citizen Kane came out in 1941. The network executives with “TV brain” in the second half of the 20th century didn’t even consider TV as a medium for art. They just cared that it was watched. It was judged only by ratings and ad revenue, not artistic merit. That’s what’s happening with software right now. But remember too, that as dreadful television programming rocketed to stratospheric popularity in the 1970s, that same decade saw a remarkable explosion in innovative filmmaking in movie theaters. Keep the faith. ↩︎︎
The Other Patreon: Joyce’s Birthday Suit!
May. 5th, 2026 09:27 pm
I have an NSFW Patreon, did you know? And while usually it’s just shitposting naked people, today, May 5, is Joyce’s canonical in-universe birthday! Even though it’s, like, February right now for her in-story. But out here in the real world, it’s May 5, and so here she is in her birthday suit! It’s important to be attired appropriately.
This is nuts: Fred Again has uploaded a video of every...
May. 5th, 2026 09:01 pm
This is nuts: Fred Again has uploaded a video of every single show he did during his USB002 tour (except Mexico City) — it’s four and a half days long. “im told this is the longest video on YouTube ever?”
Movie Posters by Eric Rohman
May. 5th, 2026 07:52 pm
Some nice work here from Swedish designer Eric Rohman, who designed thousands of movie posters in the early-to-mid 20th century. (via meanwhile)
Tags: design · Eric Rohman · movie posters · movies
The official trailer for Christopher Nolan’s The...
May. 5th, 2026 07:16 pm
The official trailer for Christopher Nolan’s The Odyssey was just released. Really looking forward to this.
What now-familiar domain names looked like before they...
May. 5th, 2026 06:28 pm
What now-familiar domain names looked like before they were bought by big-time companies, e.g. openai.com was “the personal homepage of a guy named glenn”, doordash.com was a porn site, threads.com sold spools of thread.
asking people to do a one-week work trial before offering them the job
May. 5th, 2026 05:59 pm
A reader writes:
I saw an ad for a job at a company that says they ask candidates to spend 3-5 paid days working with them before they’ll make an offer. Their ad reads, “Spending 3-5 days in person working together on a real problem is so much higher signal than interviews could ever produce.” They also say that almost every candidate they hire says they love the experience and wouldn’t want to take a job without a work trial in the future because they learned so much about how the organization operates.
Curious for your thoughts on this. It seems like a great way to screen for desperate folks without current jobs? Or is it just obvious rage-bait?
Well, on one hand, of course you learn more about candidates by working with them for five days (and they learn more about you) than you do in an interview. In a vacuum, it makes perfect sense! Some people interview really well but aren’t so good once you see them on the job. And from the candidate’s point of view, some managers sound great in an interview and turn out to be nightmares once you’re on the job.
The problem, though, is that our system isn’t set up for this. It’s not realistic for most people to be able to take off three to five days from their job (out of whatever limited vacation time they have for the year), and possibly on short notice, to do this. If someone is unemployed, it gets easier — but a ton of candidates will already have jobs, and this isn’t a reasonable expectation to put on them.
Plus, imagine that lots of companies started doing this, and that you’d have to do multiple work trials before one ended in an offer. You could easily blow through your full amount of vacation time for the year, or even exceed it, just doing work trials.
I do think it’s a great idea, for some jobs, to ask finalists at the very end of the process to complete a sample work project and pay them for it. I’ve done that before, and you learn a ton that you didn’t necessarily see in the interview and it can really differentiate your best candidates. But that’s a much lower burden than asking someone to spend a week with you.
Interviews aren’t a perfect system — far from it. But week-long work trials aren’t a reasonable solution for most people.
The post asking people to do a one-week work trial before offering them the job appeared first on Ask a Manager.