\"Writing.Com
*Magnify*
SPONSORED LINKS
Printed from https://web1.writing.com/main/profile/blog/cathartes02

Items to fit into your overhead compartment


Carrion Luggage

Blog header image

Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it locates uplifting columns of air and spirals within them to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if the bird were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
April 24, 2025 at 2:08am
After doing a bit on color earlier this week (see "Blue My Mind"), I found this related 2022 article/interview from Knowable Magazine.

    Color is in the eye, and brain, of the beholder
The way we see and describe hues varies widely for many reasons: from our individual eye structure, to how our brain processes images, to what language we speak, or even if we live near a body of water


Now, I don't have a whole lot to say about it; I consider this to be more of a follow-up. I didn't expect to follow up so soon, but such are the perils of random number generators.

Some people are color-blind. Others may have color superpowers.

The former seems to be linked to the X chromosome (men, with only one copy, lack a backup); the latter, to an XX pair. Make of that what you will; I call it semirandom genetic variation. Like how the gene complex for calico cats is linked to the XX. For whatever it's worth, I seem to have perfectly standard color vision, but I have two friends, both male, both with Irish ancestry, who are colorblind to different degrees.

That's not science, by the way. That's an observation of a couple of data points.

To learn more about individual differences in color vision, Knowable Magazine spoke with visual neuroscientist Jenny Bosten of the University of Sussex...

And the rest of the article is an edited transcript of that interview.

As I said, I don't have much to say, for once. So I'm not going to reproduce parts of the interview. I will note that they do make mention of The Dress, which I also covered in an entry fairly recently, but that was in the old blog.

There's also some reinforcement of what I said before: that the spectrum is, well, a spectrum, with way more than seven colors. Some say there are millions. I'm pretty sure the actual number is finite, at any rate. To reiterate, the "seven colors" thing can be traced back to Newton, who associated them with other mystical sevens like the Sun and Moon plus five visible planets.
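If you want to see where a "millions" figure can come from, here's the back-of-envelope arithmetic for an ordinary 24-bit display. Note this counts representable values, not shades any particular pair of eyes can actually tell apart:

    # Each of red, green, and blue gets 8 bits on a 24-bit display.
    shades_per_channel = 2 ** 8             # 256 levels per channel
    total_colors = shades_per_channel ** 3  # one value per R/G/B combo
    print(total_colors)                     # 16777216: about 16.7 million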

And at the end (spoiler alert), it reiterates the philosophical question (I say philosophical, because we don't have a scientific means of testing this yet) that Kid Me posed: do I see the same colors that you do? I don't know. I also don't know that it matters except in terms of satisfying one's curiosity.

So, maybe tomorrow I'll have more to say. We'll see. And we'll see in different colors.
April 23, 2025 at 8:42am
Today, we're taking a look at an age-old conundrum about eggs. No, not whether or not they preceded chickens (from an evolutionary perspective, they did), or why they're still so expensive, but, well, I'll let The Conversation explain it:



I'm tempted to answer "no." Most question headlines are answered "no."

You might have heard that eating too many eggs will cause high cholesterol levels, leading to poor health.

Researchers have examined the science behind this myth again, and again, and again – largely debunking the claim.


And yet, it persists, because people tend to remember only the first word on a subject, not its later retractions.

A new study suggests that, among older adults, eating eggs supports heart health and even reduces the risk of premature death.

This. This is why people don't trust nutrition science.

The article describes the study's methods. Then:

The research was published in a peer-reviewed journal, meaning this work has been examined by other researchers and is considered reputable and defensible.

At least, that's how it's supposed to work. Sometimes, though, things slip through.

Researchers received funding from a variety of national funding grants in the United States and Australia, with no links to commercial sources.

I'm glad they included this line, because funding can induce bias, even unconscious bias: researchers want to keep the money coming, so they'll tend to produce the result the funder wants. All those studies about how great dark chocolate is for you? Well, they might not be wrong, but they're suspicious because they were paid for by Willy Wonka.

Due to the type of study, it only explored egg consumption patterns, which participants self-reported. The researchers didn’t collect data about the type of egg (for example, chicken or quail), how it was prepared, or how many eggs are consumed when eaten.

There may be other confounding factors, too. It may be a correlation-not-causation thing: what if the high-level egg-eaters also had other dietary habits that are known to be heart-protective, like eating rabbit food?
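To make that concrete, here's a toy simulation (in Python, with probabilities I made up entirely) where a health-conscious streak drives both egg eating and heart health, and the eggs themselves do nothing:

    import random

    random.seed(42)

    def simulate_person():
        health_conscious = random.random() < 0.5
        # Health-conscious people are more likely to eat eggs regularly...
        eats_eggs = random.random() < (0.7 if health_conscious else 0.3)
        # ...and, separately, less likely to develop heart disease.
        heart_disease = random.random() < (0.05 if health_conscious else 0.20)
        return eats_eggs, heart_disease

    people = [simulate_person() for _ in range(100_000)]

    def disease_rate(group):
        return sum(sick for _, sick in group) / len(group)

    egg_eaters = [p for p in people if p[0]]
    non_eaters = [p for p in people if not p[0]]
    print(f"heart disease among egg eaters: {disease_rate(egg_eaters):.3f}")
    print(f"heart disease among non-eaters: {disease_rate(non_eaters):.3f}")
    # The egg eaters come out looking healthier even though eggs did
    # nothing in this model; the confounder did all the work.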

The article notes other limitations of the study, then goes off on some other scientific research, apparently unrelated to eggs. This may be an editing issue.

Here's the important part, though:

The fuss over eggs comes down to their cholesterol content and how it relates to heart disease risk. A large egg yolk contains approximately 275 mg of cholesterol — near the recommended daily limit of cholesterol intake.

In the past, medical professionals warned that eating cholesterol-rich foods such as eggs could raise blood cholesterol and increase heart disease risk.

But newer research shows the body doesn’t absorb dietary cholesterol well, so dietary cholesterol doesn’t have a major effect on blood cholesterol levels.


Unlike the egg thing, the relationship between cholesterol and heart disease risk is well-founded. What wasn't well-founded, apparently, was the idea that eating foods that naturally contain cholesterol, like eggs, is linked to blood cholesterol levels.

While the science is still out, there’s no reason to limit egg intake unless specifically advised by a recognised health professional such as an accredited practising dietitian.

And I say we're already neurotic enough about food. Worrying about it so much can't possibly be good for you.
April 22, 2025 at 10:21am
In keeping with the spirit of yesterday's entry ("Avoid snakebites by not going outside"), here are some tips from Lifehacker to mess with.

    10 of the Most Ridiculous Fees (and How to Avoid Paying Them)
I didn't realize I needed generational wealth to check my bag at the airport.


Checked bag fees? The only reason they exist is because people insist on the cheapest possible flight, which they determine before they discover that there are about fifty add-on fees in addition to the base cost, and checked bags are but one of them. Eliminate checked bag fees, and airlines will all just raise their prices the same amount. (They also have the added bonus of causing people to fight for overhead bin space for... you know... carrion.)

Processing fees. Service fees. Hidden fees. It feels like most companies and services these days have found countless ways to sneakily squeeze money out of me.

And then, on top of that, they expect you'll pay their employees for them by tipping them. Not to mention begging for money for questionable "charities."

Even when it’s only a few dollars here and there, it’s the principle of the thing: Why am I being charged in the first place?

Because they want your money. And because they can.

Concert “service” fees

If you’ve tried to buy a concert ticket in recent history, you’ve been slapped with a shocking string of processing, commission, or transaction fees.


How to avoid: stop going to concerts, like I did. Mostly because, in some cases, the add-on fees more than doubled the price of the ticket. But also because I refuse to go to any venue named after a company, which most of them are, these days.

This is a somewhat different situation than airline add-on fees. It's not like there are two or more ticket merchants trying to sell passes to the same concert at the same venue (usually). There's no "competition" reason like with airlines looking to appear to have the lowest price. No, they do it because once you've decided $200 is a perfectly reasonable price for nosebleed seats at a rock concert, the sunk cost fallacy takes over and you end up paying another $300 for processing, convenience, and Ferengi fees.

Airbnb fees

Next to the cost of concert tickets, Airbnb has gained notoriety for its bullshit fees. I've found the growing consensus is that Airbnb simply isn’t worth its exorbitant service fees.


Solution: avoid Airbnb, like I do. Maybe at first it made sense, but now they're having a measurable negative impact on the housing market, among other negative social consequences. Hotels can have sneaky fees too, but they tend to be lower. And from what I've heard, with Airbnb, you generally have to do your own cleaning, which is anathema to the whole point of going anywhere. I don't clean my own house; why should I clean someone else's? (I don't live in filth; I hire a service.)

Seat selection and airline fees

Of all the bullshit airline fees these days, “seat selection” might be the shadiest.

I happen to disagree. Those are disclosed up front, during seat selection, and if you don't like it, feel free to cheap out in a middle seat.

ATM fees

When you need cash fast, ATM fees are tough to avoid.


Are they? I haven't paid an ATM fee in decades, unless you count the foreign currency exchange fee I paid exactly once, in Paris. Mostly, you just have to have the right bank.

Car dealership fees

If there’s someone you can trust to be honest and no-bullshit, it’s a car salesman, right?


Oh, a funny person. Hey look, everybody, it's a comedian!

Seriously, though, I've bought two cars in the past 20 years, so I'm no expert here. I can't say "Don't buy a car," though, because most of us either need one or would be seriously inconvenienced without one. I do wish I could just order the car I want online, like I do with computers, rather than deal with high-pressure sales tactics and end up with something other than ideal for me.

Gym initiation fees

When you join a new gym, your first bill might come with an “initiation fee.”


There are other ways to get exercise without going to the gym. Turns out that once you're giving them money monthly, it stops being an incentive for you to go.

Credit reports

Make a habit of checking your free credit score from sites like Credit Karma or Experian.


I will admit to having a Credit Karma account. It's free, and they're up front about the reason for it being free: they advertise credit and banking services. Even without that, though, there's probably no reason to check your credit report except maybe once a year, or if you suspect identity theft. Maybe if you're about to apply for a loan, but that can roll into the "once a year" thing.

Overdraft fees

“Overdraft protection” sounds like a positive thing to stop you from taking out more money than you have in your account. However, when the bank offers overdraft protection, they charge quite the fee for it.

Sigh. There's a really easy way to avoid these, too: don't fucking overdraw your account. I know, I know, it takes work, and maybe math. If you don't like it, then don't complain about overdraft fees. Not to mention that emergencies do happen. And, again: bank shopping.

Bank statement fees

A paper bank statement can come with a wild $2 or $3 monthly fee.


Wow, whoever put this article together sucks ass at picking banks.

Online shipping fees

As much as I'd like to support in-person brick and mortar stores, sometimes Amazon one-day shipping is the only option I have. And with shipping costs these days, I know I'm guilty of buying more products just to qualify for free shipping—the classic "spend to save" trap.


Well, there's your problem right there. Consider buying less shit. Fewer shit? Whatever.

Bottom line (complete with service fees) is, companies get away with hidden and extra fees because we let them. There are some things it's worth paying extra for, like, for me, streaming without commercials, or an internet connection that doesn't require me to pay Comcast a dime. Also, shoes. Don't skimp on shoes. But this obsession with always getting the cheapest everything can get more expensive and time-consuming in the long run.
April 21, 2025 at 9:52am
People make things harder than they have to, sometimes. Case in point from Outdoor Life:

    7 Ways Not To Die From A Rattlesnake Bite
Longtime Outdoor Life contributor Michael Hanback is back with tips for avoiding snakebites


1. Don't go where rattlesnakes are.
2. Stay indoors and maybe stay on sidewalks if you must leave the house.
3. Don't go outside.
4. Play in traffic.
5. Avoid the outdoors.
6. Stay home (and don't let snakes in no matter how much they beg).
7. Definitely don't visit Australia.

Technically, if you do all these things, your chance of dying from a snake bite is low, but never zero. You will definitely die from something else, though.

Now, here's where I tell you that the article does include photos of some very cute (but potentially deadly) nope ropes, though other people may not find them as pretty as I do. Yes, I like snakes. From a distance, unless I know they're nonvenomous.

I’ve seen a few snakes here and there, but I’ve never even had a whiff of a close call with a venomous one.

Yeah, you have. You really have. You just didn't know it.

That is, until one day last June on a remote stretch of the Appalachian Trail in Western Virginia.

Which is pretty much the only place you'll find a rattlesnake (or it will find you) in Virginia. Growing up, we had copperheads and cottonmouths to deal with, but also the nonvenomous and very useful blacksnake. Well, we called them blacksnakes; their more official name is northern black racer, which is a damn cool name. The really, truly official name is Coluber constrictor constrictor, which is also cool and would make a great band name for anything but a Whitesnake cover band. My dad kept one around (much to the chagrin of my mom) and named him Goldberg "because if I named him Nixon, nobody would trust him."

Someone had run over his tail at some point (I'm still unclear as to where a snake's body ends and their tail begins), but Goldberg got around just fine.

Anyway, back to rattlesnakes.

I saw a flash in the rocks beside the footpath and peered down at a timber rattler as thick as a forearm, coiled six inches from my right boot! With glinting hints of yellow and green in the midday sun, it was both beautiful and terrifying.

Well, I'm glad I'm not the only one who can appreciate a danger noodle. Difference is, I don't go out looking for them.

Wherever you roam, your chances of a potentially dangerous rattlesnake encounter are small.

They're even smaller if you stay in the car.

1. Know Where Rattlesnakes Live

...and don't go there.

2. Know When Snakes Are Active

When it's warm. They're reptiles.

3. Gear Up Smart

I recommend full plate armor.

A guy at REI told me that in a tight situation, a thick wool sock could turn fangs, though Heaven forbid you or I ever have to find out!

Yeah, I'm not going to bet my life on what one guy tells me.

4. Watch Your Step

You know that Gadsden flag the crazies have co-opted? With the "Don't tread on me" slogan? Yeah, don't step on snakes.

5. Watch Your Reach

After ankles and legs, most snake bites occur on hands and arms.

That fear you have if you're a guy and you need to relieve yourself off-trail? Yeah, that's not going to happen. One might get your legs in that situation, though, and then you're yelling and everyone sees you with your dick hanging out, writhing around and screaming.

6. Stay Back!

Better yet, don't go where snakes are.

7. Don’t Panic

Ah, yes, useful advice in any situation, especially interstellar hitchhiking.

So, yeah, the article provides way more practical advice if you simply must go hiking in the woods for some reason, like you're hiding from the cops or something. But there are plenty of other reasons to stay indoors; snake bites are scary but fairly uncommon compared to, say, ticks, spiders, scorpions, and all manner of other arachnids. Or even having a branch fall on you. More common than quicksand, though, which TV shows when I was a kid convinced me was all over the place outside.

Meanwhile, I'll just stay on my deck and avoid the most dangerous thing that climbs up onto it with me: opossums. Which aren't even as cute as snakes.
April 20, 2025 at 8:48am
I'm not overly familiar with the source of the article I'm featuring today. It's from Open Culture, which bills itself as having "the best free cultural & educational media," and already I distrust it because if you're really the best, you don't need to self-promote as such.

But we're going to look at this article anyway.



I'd heard this assertion before, but I don't think I've ever blogged about it.

The article, incidentally, contains a video with a similar title. I didn't watch it. I don't know if it covers the same material as the writing. I prefer writing over videos.

In an old Zen story, two monks argue over whether a flag is waving or whether it's the wind that waves. Their teacher strikes them both dumb, saying, "It is your mind that moves."

That reminds me of how an optimist and a pessimist argue over whether a glass is half-full or half-empty, when, clearly, the glass is twice as big as it needs to be.

Such observations bring us to another koan-like question: if a language lacks a word for something like the color blue, can the thing be said to exist in the speaker's mind?

It's a fair question, I'll admit, but it seems to me that lots of things exist that we don't have words for.

We can dispense with the idea that there's a color blue "out there" in the world. Color is a collaboration between light, the eye, the optic nerve, and the visual cortex. And yet, claims Maria Michela Sassi, professor of ancient philosophy at Pisa University, "every culture has its own way of naming and categorizing colours."

Can we really dispense with that idea, though? Just as sound is a pressure wave in a medium such as air, water, or something solid, regardless of whether there is an ear around to hear it (so much for the "tree falls in the forest" Zen koan), color is a particular wavelength on the electromagnetic spectrum. I'd argue that insofar as color exists at all, being not a "thing" but a property of a thing, that wavelength that we agree on as "blue" exists, too. How we perceive that color is, to me, a separate issue.
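For what it's worth, here's a sketch of the conventional bands, with approximate nanometer cutoffs. The boundaries are agreements, not physics, which is rather my point:

    # Approximate wavelength bands of the visible spectrum, in nanometers.
    VISIBLE_BANDS = [
        (380, "violet"),
        (450, "blue"),
        (495, "green"),
        (570, "yellow"),
        (590, "orange"),
        (620, "red"),
        (750, None),   # past roughly 750 nm, we're into infrared
    ]

    def color_name(wavelength_nm):
        for (low, name), (high, _) in zip(VISIBLE_BANDS, VISIBLE_BANDS[1:]):
            if low <= wavelength_nm < high:
                return name
        return "outside the visible range"

    print(color_name(470))  # blue
    print(color_name(532))  # green (a typical laser-pointer wavelength)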

The most famous example comes from the ancient Greeks. Since the 18th century, scholars have pointed out that in the thousands of words in the Iliad and Odyssey, Homer never once describes anything — sea, sky, you name it — as blue.

I'd heard that, of course, but I still have questions, like: How do we know they didn't use the word for blue if they didn't have a word for blue? And, more importantly: Why are we trusting the color descriptions of a blind poet?

It was once thought cultural color differences had to do with stages of evolutionary development — that more "primitive" peoples had a less developed biological visual sense.

Yeah, evolution doesn't really work like that. No matter how primitive the culture seems to our technological senses, people are generally the same, genetically speaking, all over.

"If you think about it," writes Business Insider's Kevin Loria, "blue doesn't appear much in nature — there aren't blue animals, blue eyes are rare, and blue flowers are mostly human creations."

Well, yeah, but unless you live in London or Seattle, there's this big thing-that-isn't-a-thing called the daytime sky, which we describe as blue. It's pretty hard to miss unless you live in a cave, which even cavepeople didn't do all the time.

The color blue took hold in modern times with the development of substances that could act as blue pigment, like Prussian Blue, invented in Berlin, manufactured in China and exported to Japan in the 19th century.

I did a blog entry a while back on that particular pigment; as I recall, it featured in Japanese ("The Wave") and European ("Starry Night") art. But I have my doubts about that being the origin of our shared perception of the color blue. Newton did a lot of study of the color spectrum, breaking up sunlight using a prism like the proto-Dark Side of the Moon cover art, and he included "blue" as a color. I should note, however, that Newton seems to have chosen a seven-color scheme (the actual spectrum covers a lot more than seven shades) because of the mystical association with the number seven: days of the week, visible heavenly bodies that move (sun, moon, and five planets).

One modern researcher, Jules Davidoff, found this to be true in experiments with a Namibian people whose language makes no distinction between blue and green (but names many finer shades of green than English does). "Davidoff says that without a word for a colour," Loria writes, "without a way of identifying it as different, it's much harder for us to notice what's unique about it."

I kind of agree with that, though. There's the old story about how the Inuit have many different words for snow; it's probably false (not least because there are several languages and dialects involved), but it does speak to the larger truth that we name the things we find to be important in our lives.

It's an interesting line of inquiry, though. When I was a kid, I remember making an offhand comment to a friend like, "How do I know that the colors I see are the same as the colors you see? Like, we can both agree that this grass is green, but if I could see through your eyes, would I see the same color as I do now?" Those weren't my exact words, which I don't remember, but whatever I said, he understood what I was getting at. Much later, after the internet became a thing, someone echoed my childhood Zen koan and got a bunch of mind-blown reactions.

As far as I know, we can't know the answer to that, not yet. Perhaps someday.
April 19, 2025 at 8:38am
Okay, BBC again today; I expect everyone has heard the "signs of extraterrestrial life" story by now.



For starters, the announcement followed the trajectory I expected: from "we found evidence of the signature of a gas in an exoplanet's atmosphere that, on Earth, is only produced by living organisms" (close to the truth) to "they found evidence of life" to "hey, they found aliens!"

...the news that signs of a gas, which on Earth is produced by simple marine organisms, has been found on a planet called K2-18b.

At least the BBC isn't blowing it out of proportion.

K2-18b is totally a Star Wars droid name, though.

Now, the prospect of really finding alien life - meaning we are not alone in the Universe - is not far away, according to the scientist leading the team that made the detection.

Okay, well, he'd know better than I would, but it seems to me that a flicker of light in a spectroscope isn't the same thing as finding alien life. And I really wish they'd worded that better, because in the popular imagination, "alien life" translates right to "little bald dudes with big eyes, death rays, and flying saucers." When what they really mean is "microbes."

But all of this prompts even more questions, including, if they do find life on another world, how will this change us as a species?

Ah, and now we get into the cutting-edge philosophical question. You know. The one science fiction has been exploring for over a century now. The one we've been mulling over at least since Schiaparelli found illusory channels on Mars, which got translated to English as "canals," so of course everyone thought "Martians." And yet they present it as if it's some sort of strange, new idea.

To its credit, the article does talk about the Mars thing.

But decades on, what has been described as "the strongest evidence yet" of life on another world has come, not from Mars or Venus, but from a planet hundreds of trillions of miles away orbiting a distant star.

I'd be remiss if I didn't note that, speaking of Venus, there was a big announcement a few years ago about finding spectroscopic evidence of life-produced gases in the thick, steamy atmosphere of that planet. Which turned out to be false and was summarily retracted. If we can get false positives from our closest orbital neighbor, I'm just that much more skeptical about finding it on Star Wars Droid Planet dozens of light-years away.

Skeptical, I'll emphasize, doesn't mean "in denial." I'd love for it to be a solid discovery. I've said before that I really hope that we find indisputable evidence of extraterrestrial life during my own lifetime. Thing is, though, that it will be, as this article hints, a paradigm shift in our understanding of the Universe, and so the evidence needs to be more than circumstantial. "Extraordinary claims require extraordinary evidence," and all that.

But just because I want it to be true doesn't mean I'm going to fall for hype.

As these so-called exoplanets were being discovered, scientists began to develop instruments to analyse the chemical composition of their atmospheres. Their ambition was breathtaking, some would say audacious.

The idea was to capture the tiny amount of starlight glancing through the atmospheres of these faraway worlds and study them for chemical fingerprints of molecules, which on Earth can only be produced by living organisms, so-called biosignatures.


We don't know everything. This is a good thing, because it fuels exploration. But in this case, it means that just because a gas is only farted out by life here on Earth, that's not necessarily the only way to produce it. So, to me anyway, simply finding a biosignature is promising, but it's not enough for indisputable evidence of ET life.

Prof Madhusudhan, however, hopes to have enough data within two years to demonstrate categorically that he really has discovered the biosignatures around K2-18b.

And I "hope" to have enough money within two years to buy a Central Park West penthouse.

But even if he does achieve his aim, this won't lead to mass celebrations about the discovery of life on another world.

Instead, it will be the start of another robust scientific debate about whether the biosignature could be produced by non-living means.


That's my point: finding these biosignatures is like a big "let's look at this more closely" signal. As the article notes, we've found nearly 6000 exoplanets (not to mention the other seven in our backyard, plus several potentially life-harboring moons). The signs help us decide which ones to focus on for more observations and data.

A much more definitive discovery would be to discover life in our own solar system using robotic space craft containing portable laboratories. Any off-world bug could be analysed, possibly even brought back to Earth, providing prima facie evidence to at least significantly limit any scientific push back that may ensue.

We've had this, too. I remember a meteorite they found in Antarctica, determined to have been blasted off Mars by some ancient impact there, that contained features associated with life. As with the Venusian atmosphere thing, this turned out to be a false positive.

The European Space Agency's (ESA) ExoMars rover, planned for launch in 2028, will drill below the surface of Mars to search for signs of past and possibly present life. Given the extreme conditions on Mars, however, the discovery of fossilised past life is the more likely outcome.

And look, let's not underplay that. Even finding fossilized (as we spell it here) former life on Mars would be a Big Fucking Deal. But again, it hasn't happened yet.

Nasa is also sending a spacecraft called Dragonfly to land on one of the moons of Saturn, Titan, in 2034. It is an exotic world with what are thought to be lakes and clouds made from carbon-rich chemicals which give the planet an eerie orange haze, bringing The Beatles' song, Lucy in the Sky with Diamonds to mind: a world with "marmalade skies".

A side note: I've been playing the video game Starfield off and on for the past couple of years. It's not as good as Bethesda's prior offerings (Skyrim and Fallout 4, e.g.), but that's irrelevant. What's relevant is you can, in the game, fly around to other planets, moons, and star systems and find elements to mine to sell or to use in crafting. And one of the elements you can find on Titan is titanium. I don't think there's any scientific basis for that placement; I just find it to be an amusing pun.

But I digress. The BBC article then emphasizes (or emphasises) what I consider to be the most important point in all of this:

If simple life forms are found to exist that is no guarantee that more complex life forms are out there.

Prof Madhusudhan believes that, if confirmed, simple life should be "pretty common" in the galaxy. "But going from that simple life to complex life is a big step, and that is an open question. How that step happens? What are the conditions that govern that? We don't know that. And then going from there to intelligent life is another big step."


I've been saying that for years. I do wish he hadn't used the word "intelligent," though. That just begs someone to make the self-contradictory joke about not finding any intelligent life on Earth. So I'll just add that "intelligent" doesn't automatically mean they build telescopes and rockets (or flying saucers); lots of species here on Earth are intelligent without the capability or desire to do that.

Dr Robert Massey, who is the deputy executive director of the Royal Astronomical Society, agrees that the emergence of intelligent life on another world is much less likely than simple life.

Of course, I could be wrong, but it is nice to have some backup from actual scientists.

[Massey quoted here]"When we see the emergence of life on Earth, it was so complex. It took such a long time for multi-cellular life to emerge and then evolve into diverse life forms.

"The big question is whether there was something about the Earth that made that evolution possible. Do we need exactly the same conditions, our size, our oceans and land masses for that to happen on other worlds or will that happen regardless?"


Not mentioned: as I understand things, the grand evolutionary leap that made what he's calling "complex" life possible was the merging of an archaeon and a bacterium, creating a more energy-efficient cell and leading, eventually, to every life-form we can see on Earth today: fish, trees, cats, and us, for example. The simple cells are called prokaryotes; the more complex ones with the nucleus and organelles and internal structure you might remember from high school biology are called eukaryotes. There is some question about whether that happened only once, or several times, but either way, it took a long damn time to happen.

One of the many cool things about finding ET life, even simple life, will be the data it provides to help us understand that remarkable upgrade.

As he puts it, centuries ago, we believed we were at the centre of the Universe and with each discovery in astronomy we have found ourselves "more displaced" from that point. "I think the discovery of life elsewhere it would further reduce our specialness," he says.

And that's where I diverge from him philosophically. Just being able to ask these questions makes us special. Potentially being able to answer them would only increase our specialness. In my opinion. At the same time, though, I think I understand the underpinning of his assertion: that humans don't get to proclaim that the universe was formed specifically for them.

Never before have scientists searched so hard for life on other worlds and never before have they had such incredible tools to do this with. And many working in the field believe that it is a matter of when, rather than if, they discover life on other worlds.

I can't argue with that.

So, yeah, that's a lot of words, both in the article and here. And I've touched on many of these points in past entries, but this recent discovery prompted me to revisit my arguments. But if you skipped all the text until now, I would ask that you just remember this: finding simple life off-Earth does not mean there are other cultures Out There pointing telescopes back at us or getting ready to invade or whatever. But finding so much as a microbe, or its alien equivalent, would change our perspective in a big way.
April 18, 2025 at 9:55am
Occasionally, I like to share articles about the most profound inquiries into reality. Like this one from BBC:



When you see pasta, your brain probably doesn't jump to the secrets of the Universe.

Then you don't know my brain. Though it's usually beer that makes it jump to the secrets of the Universe.

You might think physicists only ask the big questions.

Yes, and this is one of them.

But physicists, of course, have ordinary lives outside of the laboratory, and sometimes their way of questioning the Universe spills over to their daily habits. There's one everyday item that seems to especially obsess them: spaghetti.

Come on, now. Just admit you like it. You don't have to claim it's "for science" like I do when I concoct a new cocktail recipe.

The steady torrent of spaghetti science helps to demonstrate that deep questions lurk in our ordinary routines, and that there are plenty of hungry physicists who can't stop asking them.

Me, I've always wondered why tomato sauce is hotter than anything else around it. It's basically food napalm. Like, maybe you have a pizza in the oven. You can pull that sucker out and touch the crust right away. Get any of the sauce on your hands, though, and you're looking at second-degree burns at least. Yes, I've looked it up and it has to do with heat capacity and water content combined with viscosity, or some shit like that, but that just pushes the question over to the next subject.
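Here's the "some shit like that" as a rough worked example. Water's specific heat, about 4.18 J/g·K, is a textbook number; the crust figure below is my assumption for dry bread, and this ignores viscosity and sticking entirely:

    # How much heat a bite must shed to cool from oven-hot to touchable.
    C_SAUCE = 4.0   # J/(g*K); assumed: tomato sauce is mostly water
    C_CRUST = 1.7   # J/(g*K); assumed value for dry bread crust

    def heat_to_shed(mass_g, specific_heat, t_start=90.0, t_safe=45.0):
        return mass_g * specific_heat * (t_start - t_safe)

    print(heat_to_shed(10, C_SAUCE))  # 1800 J per 10 g of sauce
    print(heat_to_shed(10, C_CRUST))  # 765 J per 10 g of crust
    # More than twice the energy per gram, which is part of why the sauce
    # is still lava when the crust is merely warm.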

Italian food: fueling scientific inquiry since 200 BCE.

For example: how thin can spaghetti get?

The article answers this question, though not without confusing the issue by using both SI and imperial units. Just stick with millimeters, okay? Spoiler: the thinnest comes in at 0.4mm.

But recently, a team of researchers at the University College London wondered if 21st Century lab equipment could do better. They used a technique called "electro-spinning". First, they dissolved flour into a special, electrically charged solution in a syringe. Then they held the syringe over a special, negatively-charged plate.

That's all very special.

The world's thinnest spaghetti is just one recent example of how physicists can't seem to stop plying their tools on everybody's favourite carb. But physicists using their noodle on their noodles is no new thing.

I would have been severely disappointed if the BBC had not made the "using their noodle" pun. You may think the BBC is a stodgy and serious news outlet, but I know better. Hell, their video player has a volume control that goes to eleven.

In 1949, Brown University physicist George F Carrier posed "the spaghetti problem" in The American Mathematical Monthly, which he deemed to be "of considerable popular and academic interest". Essentially, the problem amounts to: "Why can't I slurp up a strand of spaghetti without getting sauce on my face"?

His problem is he was American. Only Americans slurp spaghetti. Maybe Brits, too, but definitely Americans, and it's Wrong. You're supposed to twirl it around the fork and eat it neatly. This leads to another massive physics problem, though, which is the mathematics of strand entanglement.

For now, no theoretical physicist has attempted the more complicated problem of two dogs slurping from either end of the same spaghetti strand.

See? What'd I tell you? A Disney movie reference, of all things.

The great mid-century American physicist Richard Feynman helped unlock the riddles of quantum mechanics, explaining how the elementary particles that make up atoms interact with one another. But Feynman's enormous contribution to spaghetti physics is less widely known. One night, Feynman wondered why it's almost impossible to break a stick of spaghetti into two pieces instead of three.

I'd actually seen an article about this phenomenon before. I might have even blogged about it; I can't remember. But again, though Feynman was undeniably a genius (I believe he was even smarter than Einstein), he was also American, and thus didn't know that you're not supposed to break the spaghetti before cooking it.

My (Italian-American) mother taught me to break a bundle of dry spaghetti in half before putting it in boiling water, so it fits horizontally in the pot.

Yeah, well, it must be the "-American" part that thought that was a good idea.

I guess Feynman did the same, but it's an outrage to many of the world's spaghetti-eaters.

Kind of like putting pineapple on pizza.

There are a few other examples of spaghetti science, then:

Spaghetti physics even goes beyond the pasta itself – sauce is loaded with its own scientific mysteries. When eight Italian physicists met while doing research abroad in Germany, they found a shared frustration in the classic Roman dish cacio e pepe.

Such a simple dish with few ingredients, and yet very difficult to perfect.

"This is actually a very interesting problem," says Daniel Maria Busiello, co-author on the cacio study. "So we decided to design an experimental apparatus to actually test all these things."

The "apparatus" consisted of a bath of water heated to a low temperature, a kitchen thermometer, a petri dish and an iPhone camera attached to an empty box. They invited as many hungry friends as they could find to Di Terlizzi's apartment and hunkered down to cook a weekend's worth of cacio e pepe.


That. That is why I love science.

The physics they used connects the clumping of cacio e pepe to ideas about the origin of life on Earth.

And that.

Not mentioned in the article: strand entanglement. I really want to know if they've solved the math behind that. It has nothing to do with quantum entanglement, though.

...or does it?
April 17, 2025 at 9:33am
I'm back from my trip to NYC. Gotta say, the skyline still doesn't look right to me without the Twin Towers. Speaking of places that are no longer there, here's a list from Mental Floss:

    10 Places That No Longer Exist
Not even a compass could help you get to these lost places.


Not even a compass, no, but those of us who read science fiction and fantasy are used to traveling to places that don't exist. At least in imagination.

There are places that are hard to get to, places that are less explored than others, and places you’re forbidden from visiting...

One of my other perennial sources, Atlas Obscura, is pretty good at the "hard to get to" and "less explored" places. As for "forbidden," sometimes I want to visit anyway, but I'm not bold or sneaky enough to do so.

...and then there are places that you might want to go to, but they no longer exist. From land masses wiped out by changing climates to waterfalls erased by human action, here are 10 spots you won’t be able to put on your vacation bucket list.

I've used the term "bucket list" unironically before, so I don't have an inherent aversion to the term, but I'm not sure it's necessary here.

I also have a "fuck-it list" for places that I intend to go if/when the mood strikes.

Now, I should note two things before continuing: 1) There's a YouTube video embedded in the article. I didn't watch it. I'd rather read than watch. But it's there if you feel differently; it appears to cover the same topic. 2) There are also helpful pictures in the article, which I won't reproduce here. So I'm just going to highlight a few I have something to comment on.

2. The Pink and White Terraces

New Zealand was once home to what was widely called the Eighth Wonder of the World: the Pink and White Terraces.


I'd like to give them credit for the name. But I can't. It's descriptive and all, but if you're not going to use the native name for a thing, at least come up with something more creative. I'd probably have called them the Hello Kitty terraces, but these days, maybe the Barbie terraces.

The Māori had long valued the Pink and White Terraces; they viewed them as taonga, meaning “a treasure.”

Which still doesn't tell me what the Māori actually called them. Wikipedia did, though: Te Otukapuarangi or Te Tarata.

Until 1886, that is. On June 10 of that year, Mount Tarawera erupted.

A brief search didn't verify this, but at least that volcano name seems to be partly Māori.

But my main point here is that it's not always humans at fault for making places disappear; geology does a good job of that by itself.

3. Rungholt

There was once an island named Strand off the northwestern coast of what’s now northern Germany. In January of 1362, a cyclone known as the Grote Mandrenke, or “Great Drowning of Men,” caused a storm surge that wiped parts of the island off the map. With them went the medieval town of Rungholt. For centuries after Rungholt’s disappearance, people spoke of it as if it were a mythical lost city (its remains may have been found in 2023).


It is possible that several "mythical lost cities" had their origin in real cities wiped out by natural disasters.

4. East Island

People aren’t the only ones who suffer when islands disappear. After a 2018 hurricane, East Island—part of the French Frigate Shoals of the Hawaiian Islands—was swallowed by the sea.


I used to be under the impression that hurricanes were Atlantic and, if a tropical cyclone formed elsewhere, it was called something else, like a typhoon. Turns out "hurricane" is apparently the right nomenclature for Hawai'i (central Pacific) as well. And the northeastern Pacific.
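Since the difference is purely one of naming, the whole business fits in a lookup table (basin list simplified here):

    # Same storm, different name, depending on the ocean basin.
    STORM_TERM_BY_BASIN = {
        "North Atlantic": "hurricane",
        "Northeast Pacific": "hurricane",
        "Central Pacific": "hurricane",   # this one covers Hawai'i
        "Northwest Pacific": "typhoon",
        "North Indian": "cyclone",
        "South Pacific": "cyclone",
    }

    print(STORM_TERM_BY_BASIN["Central Pacific"])  # hurricane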

5. Doggerland

Doggerland was a large swath of land that once connected Great Britain to continental Europe.


I've been wondering about that place since I first heard of it. Is it the origin of some flood myths? Has anyone done underwater archaeology there? (Turns out the answer to the second question is yes.)

9. Old Man of the Mountain

For centuries, an old man’s face loomed over New Hampshire, peering out from the side of Cannon Mountain. The Indigenous Abenaki called him “Stone Face,” while the white settlers referred to him as the “Old Man of the Mountain.” Except it wasn’t an old man at all: It was a rock.


Well, at least this one lists the Native name. Or a translation of it.

Anyway, I remember when it crumbled. Not that I was there, but it was all over the news in 2003, a stark reminder that everything is ultimately ephemeral. That, I say, is how we know it's real.

10. Nuna Supermountains

The Nuna supermountains stretched across an entire supercontinent and formed roughly 2 billion years ago.


That one, I don't remember. I was too young.

Have you been somewhere that no longer exists? I'd bet you have. For me, the WTC towers (as mentioned way up at the top here) are a notable example, but there are also some bars I used to go to that I dearly miss.
April 16, 2025 at 1:29am
Today, we'll talk about some moons that aren't The Moon, for once. This article, from PopSci, concerns something you may or may not have heard about last month:

    Which planet has the most moons? Saturn dethrones Jupiter.
The International Astronomical Union recognized 128 newly discovered moons orbiting the ringed planet.


Just to be clear, I'm not questioning the finding. What I am going to question is the definition of "moon."

The ringed gas giant Saturn has officially replaced Jupiter as the planet in our solar system with the most moons. The International Astronomical Union officially recognized 128 new moons orbiting Saturn, bringing the new total up to 274 moons.

And that's certainly a lot of moons.

The moons were discovered by a group of astronomers from Taiwan, Canada, the United States, and France. Between 2019 and 2021, they used the Canada France Hawaii Telescope to repeatedly monitor the sky around Saturn.

While one could wish for a more creative name, at least "Canada France Hawaii Telescope" is descriptive.

As of February 2024, Jupiter has 95 moons. By comparison, Mercury and Venus are moonless, Earth has one moon, and Mars has two. Uranus and Neptune have 28 and 16 known moons, respectively. Despite not technically being a planet anymore, Pluto has five moons.

And here's where I'm going to get picky. No, not about Pluto's designation. I honestly don't care what they call it. I understand why it got demoted, and can't fault the logic. But the point is there was a method to it. They decided what should constitute a planet, and Pluto didn't make the cut with the new definition. Also, if you want to get really technical about it, Pluto's largest satellite, Charon, is more like a companion world; they orbit a point between the two of them.

Thing is, okay, so we have this definition of "planet" that excludes Pluto, Charon, Ceres, Vesta, Eris, etc. It's not solely about size, but the size of the world is a factor.

So we get to "moon."

The 128 new Saturnian satellites are all considered irregular moons. These are objects that orbit their host planet on an elliptical, inclined, or backwards path.

Which, again, is fine, but at some point, don't you have to call them something else? The two moons of Mars are small and irregular, with odd orbits, and are probably captured asteroids. Some of the moons in the outer solar system are bigger than Mercury. There's a continuum in between. There's also a continuum of bodies orbiting a planet ranging from small-planet-sized all the way down to very small rocks, pebbles, grains of sand, even dust.

And that's where my issue comes in. Saturn's rings have been known for a while now to be made up of really small chunks of mostly ice, though there are some larger bodies in there. Every one of those specks could be considered a moon, giving Saturn not 146, not 274, but probably millions of "moons."

The Wikipedia entry on moon, or "natural satellite," states: 'There is no established lower limit on what is considered a "moon".' While this could probably be worded better, it shows what I mean: there's no minimum size. Consequently, Saturn has, since early telescopic observation, always been known to have more moons than Jupiter does (the latter also has a ring system, but nothing like Saturn's distinctive feature).

One could argue, I suppose, that it's impossible for telescopes on or near Earth to resolve each speck of dust in Saturn's rings individually, so they shouldn't be called moons. But if so, come on, IAU: get together and agree on a definition like you did with "planet." We could use something new to argue about, because the Pluto thing is getting really stale.
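To show what I mean about the missing definition, here's a sketch with hypothetical radii; the count is entirely a function of where you draw the line:

    # Bodies orbiting a planet, radii in meters (all values hypothetical):
    # a Titan-sized moon, a mid-sized moon, an irregular moonlet,
    # a ring boulder, a ring chunk, and a grain of ice.
    satellite_radii_m = [2_500_000, 200_000, 560, 4.2, 0.03, 0.001]

    def count_moons(radii, min_radius_m):
        return sum(1 for r in radii if r >= min_radius_m)

    for cutoff in (1_000_000, 1_000, 1, 0):
        n = count_moons(satellite_radii_m, cutoff)
        print(f"minimum radius {cutoff:>9} m: {n} moons")
    # With no lower limit, every ice grain in the rings counts.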
April 15, 2025 at 3:38am
No! Not when I just learned it! Not really sure why this is in Atlas Obscura, but I'll run with it.



When someone on the phone—the doctor’s office, the bank, the credit card company—asks for my name, I always offer to spell it out—it’s a pretty uncommon surname.

I've quit offering and just do it.

This uses what's called a "spelling alphabet," or, confusingly, a "phonetic alphabet."

It is nearly impossible to distinguish, say, a B from a D, or an M from an N, without coming up with a word starting with one of those letters. But if you just pick one off the top of your head, you can make things worse. Like "B as in bed" gets heard as "so that's D as in dead?"

That's why we have the standardized spelling alphabet.
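The standard itself fits in a screenful. Here's a minimal speller using the actual ICAO words (note the official spellings "Alfa" and "Juliett"):

    # The ICAO/NATO spelling alphabet.
    NATO = {
        "A": "Alfa",    "B": "Bravo",    "C": "Charlie", "D": "Delta",
        "E": "Echo",    "F": "Foxtrot",  "G": "Golf",    "H": "Hotel",
        "I": "India",   "J": "Juliett",  "K": "Kilo",    "L": "Lima",
        "M": "Mike",    "N": "November", "O": "Oscar",   "P": "Papa",
        "Q": "Quebec",  "R": "Romeo",    "S": "Sierra",  "T": "Tango",
        "U": "Uniform", "V": "Victor",   "W": "Whiskey", "X": "X-ray",
        "Y": "Yankee",  "Z": "Zulu",
    }

    def spell(word):
        # Characters not in the table (digits, hyphens) pass through as-is.
        return " ".join(NATO.get(ch.upper(), ch) for ch in word)

    print(spell("Bed"))  # Bravo Echo Delta; no more "D as in dead"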

The British military came up with the first few examples, just for letters they found the most difficult: “P as in pip,” “B as in beer.”

Yeah, the Puritanical Americans probably came up with something like "B as in boring."

I can't complain too much, though; "Whiskey" is the official spelling word for W.

A tremendous amount of research, time, and money was invested into figuring out the optimal spelling alphabet—at least for the three languages that the International Civil Aviation Organization (ICAO, the United Nations agency that handles air transportation) felt significant enough to have one (English, French, and Spanish).

Perhaps we begin to see why some want to update the spelling alphabet: there are now many other languages in the chat.

It’s certainly the most commonly used spelling alphabet in the world, but it is, as most of these alphabets are, exceedingly Anglocentric.

And yet, they replaced Beer with Bravo, much to the detriment of, well, the world.

Other languages have come up with their own spelling alphabets. Some needed wholly new ones, such as Russian, which uses the Cyrillic alphabet. “Г as in Григо́рий” is the Russian version of “G as in Gregory.” Japanese and Mandarin Chinese both have their own letter-based alphabets (Kana and Pinyin, respectively) in addition to their traditional logographic alphabets (in which symbols stand in for whole words or phrases, rather than just sounds).

I must admit, it has crossed my mind in the past to wonder how the East Asian language speakers handled such things, but never enough to actually, you know, look it up.

Some languages that use the Roman alphabet, as English does, have letters of their own. Take Æ in Danish and Norwegian, which is usually given “Æ as in Ægir,” a figure from Norse mythology.

And also, like, how they handled accented letters in French and Spanish, which this article mentions briefly.

Voice call quality has gone down over the past two decades.

Yeah, it turns out no one really wanted to make phone calls, anyway.

Sure, much communication has moved over to text, email, and social platforms, but everyone still needs to talk on the phone sometimes.

And the most common phrase uttered in such a phone call is "What'd you say?"

Independent of their use in military and aviation capacities, we sort of need spelling alphabets now more than ever. The problem is that what we’ve been given by the 50-year-old standard is deeply flawed for modern use.

Only if you care about non-Anglophones.

"We know in speech perception that frequent words are much more easily heard in noise than infrequent words," says Hazan. That's why it is a pretty poor choice to use, say, "S as in Sierra" (the standard) instead of "S as in sugar."

Yeah, I don't buy it. "Sugar" is pronounced with a very nonstandard 'sh' phoneme up front; "Sierra" is not. I'd pick "Sucks," myself, but that's not going to happen.

Hazan, in 2006, was asked for a BBC Radio story to see if she could come up with a better spelling alphabet.

There's more at the article, but basically:

Turns out there was effectively no difference between the new, improved spelling alphabet and the old standard. If certain letters were in certain places in the nonsense combination, the new version might be more effective; in other places, the old version was. No difference! After all that!

Hey, at least they were tested rather than just assumed to work, like with the older alphabets.

This can be partly explained because people have just grown familiar with the whole “alfa-bravo-charlie” thing. It’s in books and movies, it’s just one of those things we absorb without thinking about it.

And there's just something satisfying about saying Whiskey Tango Foxtrot, even though it's twice as many syllables as the original "What the fuck?"

So, no, I don't think we need to come up with a new spelling alphabet, except in terms of expanding it to allow for different language alphabets. Besides, I just recently (almost) mastered this one, and I don't want to go through the whole memorization thing again. It would be like if English were to suddenly decide to simplify its spelling: sure it would be easier going forward, but all of us who have worked to learn and deal with the idiosyncratic spellings we have now wood be todaly steemed.
April 14, 2025 at 7:51am
Today, in "things we'd like to believe," from MedicalXpress:



Well, sign me up! For some of those things, anyway. Never did get a taste for coffee.

In all seriousness, though, we shouldn't be taking any of these nutrition science studies at face value as presented, whether they tell us what we want to hear or not.

A diet rich in produce such as grapes, strawberries, açaí, oranges, chocolate, wine and coffee can reduce the risk of metabolic syndrome...

Okay, even if this one study is definitive, what about other health issues besides metabolic syndrome?

...according to the findings of a study involving more than 6,000 Brazilians...

Can't fault the sample size on this one, though.

...the largest in the world to associate the effects of consuming polyphenols with protection against cardiometabolic problems.

"But, but, but, I can't pronounce polyphenols, so I shouldn't be eating them!"

Seriously, though, at least they reveal the key chemical up front. Though calling it a chemical will freak some people out. I don't care. Everything you eat or drink contains nothing but chemicals.

Polyphenols are bioactive compounds with well-known anti-oxidant and anti-inflammatory properties.

Can't be arsed to look it up because I have limited time due to travel, but I'm pretty sure the polyphenols are what started the "red wine is good for you" craze a couple decades back. Since then, they've waffled back and forth on the subject, depending on what result whoever funded or did the study wanted to push. (This is why I do not trust nutrition science.)

"This is good news for people who like fruit, chocolate, coffee and wine, all of which are rich in polyphenols. Although the link between consumption of polyphenols and a reduction in the risk of metabolic syndrome had already been identified in previous studies, it had never before been verified in such a large study sample [6,378 people] and over such a long period [eight years]," said Isabela Benseñor, a co-author of the article and a professor at the University of São Paulo's Medical School (FM-USP) in Brazil.

To reiterate, though, what about other health issues? It's unlikely anyone's going to come out with a "fruit is bad for you" study, but fun-haters will definitely do everything they can to debunk the chocolate and wine part. And it's possible for an item to be good for you in some ways and bad in others. Like how aspirin has been shown to protect against heart attacks, but you have to balance that with the side effects of aspirin.

As for coffee, it's still the only acceptable thing for Americans to be addicted to, because it aids productivity and allows people to function on less sleep. Gods forbid we actually enjoy something that makes us less functional.

Detailed interviews based on questionnaires were conducted to find out about the participants' dietary habits and the frequency with which they ingested 92 polyphenol-rich foods.

This is my other caution with nutrition science: methodology is often suspect. In this case, self-reporting was used, which is notoriously flaky.

The main conclusion was that consumption of polyphenols from different foods at the highest estimated level (469 mg per day) reduced the risk of developing metabolic syndrome by 23% compared with the lowest polyphenol consumption (177 mg per day).

I'd also like to point out that this doesn't do much to show causation rather than correlation. In other words, would the same benefit be seen if someone forced themselves to consume polyphenol-rich foods when they usually don't? How do we know it's not "people less prone to metabolic syndrome prefer to consume more fruits, coffee, etc.?"

Also, while 23% is significant, I'm not sure it's worth eating something you dislike. Like, if someone told me I'd have a 20% lower chance of prostate cancer if I drank coffee every day, well, first I'd have to know my baseline chance, because 20% off a 10% risk matters far less than 20% off an 80% risk. And then I'd have to weigh the risk reduction against simply despising the taste of coffee. Additionally, some of those foods listed can be quite expensive, and not everyone can afford them.
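To make that arithmetic concrete, here's a quick Python sketch of relative versus absolute risk. The two baseline risks are invented for illustration; only the 23% relative reduction comes from the article.

# Toy example: one relative risk reduction, two made-up baselines.

def risk_after_reduction(baseline, relative_reduction):
    """Apply a relative risk reduction and return the new risk."""
    return baseline * (1 - relative_reduction)

for baseline in (0.10, 0.80):
    new_risk = risk_after_reduction(baseline, 0.23)
    print(f"baseline {baseline:.0%} -> {new_risk:.1%} "
          f"(absolute drop: {baseline - new_risk:.1%})")

# baseline 10% -> 7.7% (absolute drop: 2.3%)
# baseline 80% -> 61.6% (absolute drop: 18.4%)

Same headline number, very different real-world payoff. That's why the baseline matters.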

Anyway, like I said, I'm busy today. There's more at the link. I just wanted to throw this into the mix to show why we shouldn't just automatically believe headlines like the one in the article, whether we want to or not.
April 13, 2025 at 12:35am
#1087131
I'll be traveling this week, so posts will be whenever I can find the time to make them. Like now, before I get some sleep so I can leave early in the morning.

Another older article today, an Ars Technica piece from 2019. This is significant, because clearly, the "techniques" they discuss therein didn't work to combat the misinformation and anti-science rhetoric that amped up in the following year.

    Two tactics effectively limit the spread of science denialism  Open in new Window.
Debunking the content or techniques of denialism mitigates their impact.


Does it, though? Does it really?

“Vaccines are safe and effective,” write researchers Philipp Schmid and Cornelia Betsch in a paper published in Nature Human Behavior this week.

Again... 2019.

“Humans cause global warming. Evolution theory explains the diversity and change of life.” But large numbers of people do not believe that these statements are true, with devastating effects: progress toward addressing the climate crisis is stultifyingly slow, and the US is seeing its largest measles outbreak since 2000.

I checked the statistics, and yes, the one in 2019 was even larger than the current measles outbreak... so far.

In their paper, Schmid and Betsch present some good news and some bad: rebutting misinformation reduces the ensuing level of science denialism, but not enough to completely counter the effect of the original exposure to misinformation.

If what we've seen over the past five years is a reduction, I'd hate to have seen the unmitigated disaster.

Schmid and Betsch make a point of emphasizing that science denialism is a universe away from a healthy skepticism. In fact, skepticism of existing results is what drives research to refine and overturn existing paradigms. Denialism, the authors write, is “dysfunctional” skepticism “driven by how the denier would like things to be rather than what he has evidence for.”

There's also, I think, a knowledge gap involved. If you don't know how to fly a helicopter, don't get behind the controls of one. If you think you know how to fly a helicopter because you've seen action movies, you're wrong. Similarly, if you think you know everything about vaccines because you've watched a few videos online, you're wrong. I don't know everything about vaccines, but I have the advantage of living in the same house as an epidemiologist. And, usually, the advantage of being able to tell good science from bad.

Schmid and Betsch focused on strategies to counter misinformation as it is being delivered during a debate, focusing on two possible approaches: correcting misinformation and laying bare the rhetorical techniques that are being used to obfuscate the truth.

Maybe part of the problem is allowing it to get to the point of debate. When you get a flat-earther up on stage discussing the shape of the planet with a... well, with anyone with brains, you're putting them on equal footing. You shouldn't do that. Flat-earth nonsense needs to be nipped in the bud, even if it does make the flat-earther feel persecuted and perversely vindicated. They can have their own platform, not one shared with scientists.

Flat-earth bullshit is only the most obvious of these types of "my ignorance is just as good as your knowledge" things, though.

For instance, in the case of vaccine denialism, a denier might argue that vaccines are not completely safe. Correcting this misinformation (which Schmid and Betsch call a “topic” rebuttal) could take the form of arguing that vaccines in fact have an excellent safety record. A “technique” rebuttal, on the other hand, would point out that demanding perfect safety is holding vaccines to an impossible standard and that no medication is 100 percent safe.

"See? It's only 99.9999% safe! Why take the chance?" Because failing to vaccinate causes more death.

The article goes into the methods used in the study, then:

But one thing seems clear: it could be better to turn up and debate a denialist than to stay away, a tactic that is sometimes advocated out of fear of legitimizing the denialism.

Which is exactly the opposite of what I just said up there. This tells us three things:

1) I'm not an expert, either (but I can generally spot experts);
2) I can be wrong;
3) Unlike denialists, I can admit when I'm wrong.

Still, I'm not going to debate any of these things in person. My memory is too crappy, my knowledge is too broad and not deep enough, and I'm not much of a public speaker. There's no way I could keep up with the flood of misinformation and outright lies that the denialist (of whatever) is spouting. If someone else wants to do it, someone with actual credentials and who's not going to freeze up on stage, go for it.

But the bullshit comes too fast. A lie is wiping its dick on the curtains while the truth is still struggling to get the condom on.

It's an uphill battle. Sisyphean, even, because once you push the boulder to the top of the hill, they'll roll it right back down again.

And yet, I have to try.
April 12, 2025 at 3:22am
#1087073
This Wired article is fairly old, and published on my birthday, but neither of those tidbits of trivia is relevant.

    Why a Grape Turns Into a Fireball in a Microwave  Open in new Window.
Nuking a grape produces sparks of plasma, as plenty of YouTube videos document. Now physicists think they can explain how that energy builds up.


No, what's relevant is that fire is fun.

The internet is full of videos of thoughtful people setting things on fire.

See?

Here’s a perennial favorite: Cleave a grape in half, leaving a little skin connecting the two hemispheres. Blitz it in the microwave for five seconds. For one glorious moment, the grape halves will produce a fireball unfit for domestic life.

Unfortunately, you can only see it through the appliance's screen door (that screen serves the important function of keeping most of the microwaves inside the microwave), and I don't know what it might do to the unit, so don't try this with your only microwave. Or at least, don't blame me if you have to buy a new one. I'm not going to pay for it.

Physicist Stephen Bosi tried the experiment back in 2011 for the YouTube channel Veritasium, in the physics department’s break room at the University of Sydney.

What's truly impressive is that Bosi, the grape, and the microwave oven were all upside-down.

Off-camera, they discovered they had burned the interior of the physics department microwave.

What'd I tell you? I'm not responsible if you blow up the one at work, either. Still, if the last person to use it committed the grave sin of microwaving fish, this might be an improvement.

I should also note that the article contains moving pictures of the effect. These are cool, but you might hit a subscription screen. With my script blocker, I could see the text, but not the pictures.

But it turns out, even after millions of YouTube views and probably tens of scorched microwaves, no one knew exactly why the fireball forms.

As regular readers already know, this is the purpose of science.

After several summers of microwaving grape-shaped objects and simulating the microwaving of those objects, a trio of physicists in Canada may have finally figured it out.

At least they weren't upside-down. Sucks if they wanted to nuke some poutine, though.

The fireball is merely a beautiful, hot blob of loose electrons and ions known as a plasma. The most interesting science is contained in the steps leading up to the plasma, they say. The real question is how the grape got hot enough to produce the plasma in the first place.

And this is why some people think science sucks the joy out of everything. No, nerds: the fireball is the cool part. The science is merely interesting.

Their conclusions: The grape is less like an antenna and more like a trombone, though for microwaves instead of sound.

Huh. Never heard of a trombone exploding into a blaze of glorious fire, but I suppose it could happen. Better to save that fate for instruments that deserve it, like bagpipes, accordions, and mizmars.

I joke, yes, but the article explains it rather well. If you have a subscription. Or can cleverly bypass that annoying restriction.

The grape, incidentally, is the perfect size for amplifying the microwaves that your kitchen machine radiates. The appliance pushes microwaves into the two grape halves, where the waves bounce around and add constructively to focus the energy to a spot on the skin.

Not explained: if the grape is "the perfect size," how come it works for grapes of different sizes?
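The article doesn't show the arithmetic, but the scale is easy to ballpark. Here's my own back-of-the-envelope sketch; the oven frequency is the standard one, and the refractive index is an approximate value for water at that frequency, neither taken from the article.

# Rough scale check: a microwave's wavelength inside watery grape flesh.

c = 3.0e8     # speed of light, m/s
f = 2.45e9    # typical oven magnetron frequency, Hz
n_water = 9   # approximate refractive index of water near 2.45 GHz

wavelength_air = c / f                          # about 12.2 cm
wavelength_in_grape = wavelength_air / n_water  # about 1.4 cm

print(f"in air:   {wavelength_air * 100:.1f} cm")
print(f"in grape: {wavelength_in_grape * 100:.1f} cm")

A bit over a centimeter, which is grape-sized. If that's the mechanism, I'd guess the resonance is forgiving rather than exact, which would cover grapes of somewhat different sizes.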

A common misconception is that the microwave acts on the grape from the outside in, like frozen meat defrosting, says physicist Pablo Bianucci of Concordia University, who worked on grape simulations included in the paper.

I don't know where Concordia University is, so I can't make jokes about its location. Oh, wait, I could look it up.

...

Oh, it's in Montreal. Which at least squares with the "trio of physicists in Canada" bit. So the poutine joke stands.

Anyway, I didn't know people still thought microwaves heated from the outside in. We can't all be physicists, but I was under the impression that it's fairly common knowledge that the wavy EM thingies work by exciting the water molecules throughout the... whatever you put in there. That's why it's usually faster to nuke a cup of water than it is to boil it on the stove.
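That cup-of-water claim is easy to sanity-check with introductory physics. A minimal sketch, where the absorbed wattage is my guess at a typical home unit:

# How long should a microwave take to boil a cup of water?

mass = 0.25       # kg, roughly one cup of water
c_water = 4186    # specific heat of water, J/(kg*K)
delta_t = 80      # degrees C, from tap-ish 20 C up to boiling

power_absorbed = 800  # W, an assumed value for a typical oven

energy = mass * c_water * delta_t   # joules required
seconds = energy / power_absorbed

print(f"{energy / 1000:.0f} kJ, about {seconds:.0f} seconds")
# prints: 84 kJ, about 105 seconds

A bit under two minutes, which matches kitchen experience, largely because the energy goes into the water itself instead of into a burner and a pot first.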

The work has more serious applications too, Bosi says.

Look, not everything needs to be useful for something. But when it is, that's pretty cool.

His experiments with grape balls of fire...

And there we have it, folks: the real reason I saved this article to share with all of you.

...began and ended with the 2011 YouTube video, but his curiosity did not. “I’m impressed with the scientific depth of the paper,” wrote Bosi in an email. In particular, he notes that authors came up with mathematical rules for describing the grape hotspot. They could conceivably shrink these rules to a smaller scale, to create similar hotspots in nanoparticles, for example. Scientists use heated nanoparticles to make very precise sensors or to facilitate chemical reactions, says Bianucci.

I'll take their words for it.

During all their microwaving, they noticed that two grapes placed side by side repeatedly bump into each other, back and forth. They don’t know why that happens, and they’ll be studying that next, says Bianucci.

Always something else to study. This is a good thing.

Not mentioned in the article: how in the hot hell did anyone figure out that putting a grape, cut mostly in half but still connected by a tiny thread of grape skin, into a microwave would produce a "grape ball of fire?" It's not like we eat warm grapes. Even if we did, that's still a very specific configuration.

Some mysteries, I suppose, will never be solved. And that's also a good thing.
April 11, 2025 at 9:13am
#1087018
I'm more than a little pissed at Time right now because they reported the "dire wolf de-extinction" story as if it were true and not a steaming pile of bullshit. Don't know what I'm talking about? Use a search engine; I'll be damned if I'm going to give that crap any more boost by linking it.

But I'm really hoping they got the science right on this article:



"Surprising," I guess, if you're a prude. It makes me feel better to cuss, so I've always known it had health benefits (for me, not the people I'm cussing at). Still, it's good to have science backing me up. If it's true. After the "dire wolf" bullshit, I can't be sure.

Many of us try to suppress the urge to blurt out an expletive when something goes wrong.

And many of us try to hold sneezes in. That doesn't mean it's healthy.

Research has found that using profanity can have beneficial effects on people’s stress, anxiety, and depression. In fact, there are numerous potential physical, psychological, and social perks related to the power of a well-timed F-bomb.

"Social?" I guess it depends on the society.

Cursing induces what’s called hypoalgesia, or decreased sensitivity to pain. Researchers have shown that after uttering a curse word, people can keep their hands submerged in ice water for longer than if they say a more neutral word.

I get why they do the submerged in ice water thing. It's a low-risk means of inducing some level of pain in a test subject. Other kinds of pain may be unethical for scientists. But I wonder about the efficacy of low-risk pain inducement in a study such as this. For one thing, a big part of pain is the surprise. If you know you're going to get stuck with a needle at the dentist, you can control your reaction somewhat (though it's quite difficult to swear with your mouth wide open and the dentist's fingers in there).

But here’s an interesting twist: “People who swear less often get more benefit from swearing when they need it,” he says. In other words, cursing all the time zaps the words of their potency.

That's not surprising to me. I prefer to hold back the important words for when they can provide better emphasis.

Swearing aloud is associated with improvements in exercise performance, including cycling power and hand-grip strength.

This wouldn't surprise me either. I glanced at the study. Decent sample size, but restricted demographics (i.e. one of those studies that used students as swearing guinea pigs), and the control group used neutral language, presumably words such as "hit," "truck," or "bunt."

A study in the European Journal of Social Psychology found that when people wrote about a time they felt socially excluded, then repeated a swear word for two minutes, their hurt feelings and social distress were significantly lower than for people who used a neutral word.

Taken together with the findings about physical pain, this might lend more credence to the idea that physical pain and emotional pain are related in more ways than just being described with the same word.

In another study, researchers found that when drivers cursed after being refused the right of way by another driver, or when they encountered a traffic jam caused by cars that were stopped illegally, cursing helped them tamp down their anger and return to a more balanced emotional state.

I didn't look at that study. I've experienced this myself. And "cursing" in this context includes showing the offender my middle finger.

There appear to be surprising social benefits associated with the well-timed use of profanity. “Some people believe that profanity can break social taboos in a generally non-harmful way, [which] can create an informal environment in which people feel like insiders together,” says Ben Bergen, a professor of cognitive science at the University of California, San Diego, and author of...

This isn't on the same level as those other assertions. "Some people believe" is weasel words, which is why I'm not including the name of his book. I don't doubt that it does these things, but, as anyone who's been on WDC for a while can attest, cussing can also alienate some people.

Of course, it is possible to overdo it. People who swear frequently are sometimes perceived as angry, hostile, or aggressive, so there’s a potential tipping point to using profanity.

Again, I'm pretty sure that's true, but: what's the tipping point? I suspect it's different for different groups. Baptist church vs. biker bar, e.g.

The article does address this qualitatively:

It’s also important to know your audience.

Swearing etiquette may depend on the social hierarchy and power dynamics in certain situations, such as the workplace, says Jay. Just because the boss uses curse words doesn’t necessarily mean you can get away with it. (You’ll also want to modify your language around young children.)

Nah. I want young children to stay as far away from me as possible. If I cuss in public, their parents herd them away. I win. They win, too, because I have furthered their education.

Not addressed in the article: whether writing "fuck" has similar benefits to saying it. I suspect not. Clearly, further study is needed. Can I get money for being a guinea pig in that study?
April 10, 2025 at 12:42am
#1086956
I'm posting early today because I have a dentist thing that will a) take all morning and b) leave me in no shape to form coherent sentences (worse than usual, I mean) in the afternoon. Speaking of posting schedule, I'll be going on a little trip next week, so blog posts will be erratically timed.

For today, though, I'll try not to make any tired old "place is in the kitchen" jokes about today's article from Gastro Obscura. No promises.

    Meet the Feminist Resistance Fighter Who Created the Modern Kitchen  Open in new Window.
Margarete Schütte-Lihotzky left an indelible mark on Austria, architecture, and how we cook.


Sexist jokes notwithstanding, this scene is set in Austria in the 1940s, when a certain political party, led by a certain Austrian, held as a central platform that women were for children, kitchen, and church. Which should be enough right there to rebel against the entire idea of rigid gender roles.

Schütte-Lihotzky had been imprisoned since 1941 for her work as a courier for the Communist Party of Austria (KPÖ), which led the resistance against the Nazi regime in her home country. While she managed to narrowly avoid a death sentence, Schütte-Lihotzky remained in jail until the end of World War II in 1945. The incarceration would forever split her life in two. On the one side were her beginnings as a precocious and successful architect spurred on by the desire to create a better life for working-class women. On the other, what she would refer to as her “second life,” as an active communist, political activist, and memoirist who was professionally shunned in Austria for her political beliefs and received her much-deserved accolades only in the final decades of her life.

I suppose it could have been worse. Some people don't get recognized until after they croak.

Schütte-Lihotzky led a remarkably long and full life, dying a few days short of her 103rd birthday in 2000. But her name remains forever connected to a space she designed when only 29 years old: the Frankfurt Kitchen, the prototype of the modern fitted kitchen.

Which is so ubiquitous in developed countries now that it's hard to imagine a time when it didn't exist.

Designed in 1926 as part of a large-scale social housing project in Frankfurt, Germany, the “Frankfurt Kitchen” introduced many of the elements we now take for granted...

So the concept of a kitchen as we know it today is just under 100 years old. That's not too surprising; 100 years ago, we were still arguing over things like the size of the Universe and what powers the Sun. Still, I'd have said "take for granite," because of the proliferation of granite countertops in kitchens and because I can't resist a gneiss play on words.

...a continuous countertop with a tiled backsplash, built-in cabinets, and drawers optimized for storage—all laid out with comfort and efficiency in mind.

Whoever put my kitchen together must have forgotten about the "optimized for storage" bit.

“She didn’t just develop a kitchen,” says Austrian architect Renate Allmayer-Beck. “It was a concept to make women’s lives easier by giving them a kitchen where they could manage more easily and have more time for themselves.”

Thus leading inexorably to women joining the workforce, which, if you think that's a bad thing, boy are you reading the wrong blog.

The article even addresses the obvious:

While the Frankfurt Kitchen was marketed as a kitchen designed for women by a woman, Schütte-Lihotzky resented the implication that her gender automatically endowed her with secret domestic knowledge, writing in her memoir that “it fed into the notions among the bourgeoisie and petite bourgeoisie at the time that women essentially work in the home at the kitchen stove.”

I vaguely remember featuring a bit back in the old blog about the invention of the automatic dishwasher, which predated the Frankfurt kitchen (I suppose that rolls off the tongue and keyboard more easily than "Schütte-Lihotzky Kitchen") by a few decades. That, too, was a woman's work. And that's the closest I'm going to get to making a "women's work" joke; you're welcome.

The Frankfurt Kitchen was efficiently laid out and compact, to save both on costs and the physical effort required to use it. Here, a woman could move from sink to stove without taking a single step. This quest for efficiency also led Schütte-Lihotzky to move the kitchen from a corner of the family room into its own space—a choice that baffled contemporary homemakers.

And then, decades later, they'd take away the wall separating the kitchen from the family room, putting it back into one big open space. I spent my childhood in a house with an open-concept kitchen/living area, and I have nothing inherently against it. What I have a problem with is all the remodeling shows that insist on that kind of layout. Not because they insist on it, but because they're thinly-veiled ads for home improvement stores, and they enable that bane of the housing market in the US: house flippers.

The article even addresses the open-concept change, if obliquely:

When the Frankfurt Kitchen came under fire from second-wave feminists in the 1970s for isolating women in the kitchens and making domestic labor invisible, the critique hit her hard.

She defended her design in her memoir. “The kitchen made people’s lives easier and contributed to women being able to work and become more economically independent from men,” she wrote. Still, she conceded, “it would be a sad state of affairs if what was progressive back then were still a paragon of progress today.”


I feel like a lot of people would defend their life's work to the last, but that quote demonstrates a willingness to keep an open mind, even later in life, and to acknowledge that nothing is ever truly completed. As they used to say, "a woman's work is never done."

There's a lot more at the link, which I found interesting because I was only vaguely aware that today's kitchen designs owed a debt to something called a "Frankfurt Kitchen," but I didn't know anything about how it came to be. I figured maybe someone else might want to know, too.
April 9, 2025 at 11:19am
#1086899
I sure talk about the Moon a lot. We're coming up on another Full Moon, by some reckonings the Pink Moon, the first Full Moon after the Northern Hemisphere Spring Equinox. It's also a culturally significant Full Moon because it marks the start of Pesach, or Passover, and helps to define the timing of Easter. This will occur on Saturday, based on Eastern Time.

But this article, from aeon, isn't about Moon lore or cultural observances; quite the opposite.

    How the Moon became a place  Open in new Window.
For most of history, the Moon was regarded as a mysterious and powerful object. Then scientists made it into a destination


On 25 May 1961, the US president John F Kennedy announced the Apollo programme: a mission to send humans to the Moon and return them safely to Earth within the decade.

Specifically, white American male humans, but hey, one small step and all that.

The next year, the American geologist Eugene M Shoemaker published an article on what it would take to accomplish the goal in American Scientist. It is an extraordinary document in many ways, but one part of his assessment stands out. ‘None of the detailed information necessary for the selection of sites for manned landing or bases is now available,’ Shoemaker wrote, because there were ‘less than a dozen scientists in the United States’ working on lunar mapping and geology.

I had to look it up to be sure, but yeah, this was the same guy who co-discovered Comet Shoemaker-Levy 9, the one that impacted Jupiter back in the 1990s, right around the time we coincidentally started confirming the existence of exoplanets. That's a lot of astronomy wins for a geologist, especially considering that, technically, "geology" only applies to Earth.

I think that's a word it's safe to expand the definition of, though; otherwise, we'll have selenology, areology, and any number of other Greek-rooted world names attached to -ology. The problem becomes especially apparent when you consider we also have geography, geometry, and geophysics. Some sources refer to him as an astrogeologist; I'm not really picky about the wording in this case, as long as we all understand what's meant, though technically "astro-" refers to stars, not moons or planets. Being picky about that would cast doubt on "astronaut" as a concept.

Incidentally, he apparently died in a car crash in 1997, and some of his ashes got sent to the Moon with a probe that crashed into its south pole region. A fitting memorial, if you ask me.

But I digress.

The Moon is a place and a destination – but this was not always the case.

Well, it was certainly a destination for Eugene M. Shoemaker. Or part of him, anyway.

To geographers and anthropologists, ‘place’ is a useful concept. A place is a collision between human culture and physical space. People transform their physical environment, and it transforms them. People tell stories about physical spaces that make people feel a certain way about that space. And people build, adding to a space and transforming it even further.

So, this is a situation where science, technology, anthropology, folklore, mythology, linguistics, engineering, and psychology (and probably a few other ologies) meet. In other words, candy for Waltz.

Now, you might be thinking, as I did, "But science fiction treated other worlds as 'places' long before we sent white male American humans to the Moon." And you'd be right (because, of course, I was). The key is in the definition of 'place' I just quoted from the article: the Moon became a real place, as opposed to the speculative place it had in science fiction and fantasy:

Centuries ago, a major reconceptualisation took place that made it possible for many to imagine the Moon as a world in the first place. New technologies enabled early scientists to slowly begin the process of mapping the lunar surface, and to eventually weave narratives about its history. Their observations and theories laid the groundwork for others to imagine the Moon as a rich world and a possible destination.

Then, in the 1960s, the place-making practices of these scientists suddenly became practical knowledge, enabling the first visitors to arrive safely on the lunar surface.


One might argue that we lost something with that, like the folklore and mythology bits. But we gained something, too, and didn't really lose the folklore (though some of it, as folklore is wont to do, changed).

For much of history, the Moon was a mythological and mathematical object. People regarded the Moon as a deity or an abstract power and, at the same time, precisely charted its movement. It seemed to influence events around us, and it behaved in mysterious ways.

The connection between the Moon and tides was clear long before Newton explained gravity enough to demonstrate a causal relationship.

There were some who thought about trips to the Moon. Stories in religious traditions across the world tell of people travelling to the Moon. There were some thinkers before and after Aristotle who imagined that there were more worlds than just Earth. The ancient atomists discussed the possibility of worlds other than Earth, while other Greeks discussed the possibility of life on the Moon. This included Plutarch, who wrote about the Moon as both mythical and a physical object. But, to the extent that the Moon was thought about as a place, the notion was largely speculative or religious.

I sometimes wonder if, had we not had the big shiny phasey thing in the sky, our perception of space travel might have been different. The only other big thing in the sky is the Sun; all the other relatively nearby objects resolve to little more than dots: Venus, Mars, etc. I suspect that the presence of a visible disc, with discernible features even, might have served as a stepping-stone to imagining those other dots as worlds, once the telescope could start us seeing them as discs, too.

It would certainly have made mythology and folklore a lot different, not having a Moon.

The rest of the article is basically a brief (well, not so brief because it's aeon, but brief in comparison to human history) recap of our cultural relationship with the Moon. I don't really have much else to comment on, but I found it an interesting read, especially to see how our understanding has changed over time.
April 8, 2025 at 10:11am
#1086823
Got this one from Time, and now it's Time to take a look at it.



"Has become?" Always has been.

Imagine walking through New York City, invisible.

I don't have to. I've done it. People bumped into me (and didn't even pick my pocket), cars didn't stop at crosswalks, and taxicabs just zoomed on by when I hailed them.

This is also known as "being in New York City."

Marilyn Monroe, one of the most recognizable women in the world, once did exactly that.

The article describes how no one recognized her until she started acting Marilyny. There's some irony (or whatever) there, because it wasn't Marilyn Monroe who (if the story is true) walked through NYC invisibly; that was Norma Jeane Mortenson. So who was being herself? Marilyn or Norma Jeane? Who is real and authentic: Superman or Clark Kent? (Yes, I know, trick question; they're both fictional.)

Her story is extreme, but her struggle is not unique. Like Marilyn, many of us learn to shape ourselves into what the world expects. Refining, editing, and performing until the act feels like the only version of us that belongs.

Well, yeah. And then you become the act. And that becomes your authentic, real, true self. This isn't news or something to be ashamed of; it's the essential process of life as a human.

Today, even authenticity is something we curate, measured not by honesty but by how well it aligns with what’s acceptable. The pressure to perform the right kind of realness has seeped into every aspect of modern life.

Oh, boo hoo hoo. "Today," my ass. We've been doing this since we figured out this newfangled "fire" shit, if not before then. I might even postulate that the pressure to fit in, to conform, to not act like but be the person your society expects was even stronger in pre-industrial times.

Authenticity was supposed to set us free. Instead, it has become something we must constantly prove. In a culture obsessed with being “real,” we curate our imperfections, filter our vulnerabilities, and even stage our most spontaneous moments online.

Who's this "we" person?

I figured out a long time ago that I needed to be someone different at work than I was for, say, my role-playing game group. The latter helped with the former.

Those who should know these things told me that people responded well to honesty and authenticity, so I learned to fake those qualities.

Instead of naturally shifting between different social roles, we now manage a single, optimized identity across multiple audiences—our family, coworkers, old friends, and strangers online.

Again, who the fuck is "we?" Not me.

The bigger, paradoxical problem is, however, that the more we strive to be real, the more we perform; and in proving our authenticity, we lose sight of who we truly are.

To me, this is like saying "No one sees how we truly look; they only see the wardrobe and hairstyle we choose." Hell, even nudists get to choose their hairdos. Who "we" are is always a performance. Eventually, the performance becomes who we are. Fake it 'til you make it, and all that.

Think back to childhood. At some point, you probably realized that certain behaviors made people like you more. Maybe you got extra praise for being responsible, so you leaned into that. Maybe you learned that cracking jokes made you popular, so you became the funny one.

Okay, now you're attacking me directly.

Psychologists call this the “False Self”—a version of you that develops to meet external expectations.

Well, far be it from me to dispute what professional psychologists say, but again, that's like saying "society expects us to wear clothing to cover one's genitals, so the only way to be authentic is to be naked."

And even then, which is more authentic: pre-shower, or post-shower? And do you comb/brush your hair? Then you're not being authentic; you're conforming to society's norms.

My point here is that despite what the article says, authenticity isn't always a good thing. Maybe your "authentic" self is a thief, and you don't want to face society's punishment for that, so you choose not to steal stuff. You're tempted, sure, but you just walk past the shinies instead of pocketing them, or restrain yourself from picking an NYC pedestrian's pocket or running off with her purse. You become not-a-thief, and that eventually becomes your true self.

Some of us are just naturally funny, but others have to work at it. The desire to work at it is just as authentic as the not-being-funny part.

What's the point of trying to improve yourself if you then get slammed for being "unauthentic?" A violent person may want to do the work to stop being violent. A pedophile may choose to deliberately avoid being around children. Is that not a good thing for everyone?

As for code-switching, are we supposed to wear the same clothes for lounging around the house, going to a gym, working, and attending a formal dinner? This is the same thing, but with personality.

Authenticity isn’t something you achieve. It’s what’s left when you stop trying. Yet, the more we chase it, the more elusive it becomes.

Well gosh, you know what that sounds exactly like, which I've harped on numerous times? That's right: happiness.

Culture shifts when enough people decide to show up as they are.

Naked with uncombed hair?

Hard pass.
April 7, 2025 at 9:25am
#1086745
It's nice to be able to see through optical illusions, as this article from The Conversation describes. It would be even nicer to be able to see through lies and bullshit, but that's probably harder.



And I did find possible bullshit in this article, in addition to the slightly click-baity headline.

Optical illusions are great fun, and they fool virtually everyone. But have you ever wondered if you could train yourself to unsee these illusions?

I can usually see past the optical illusion once it's pointed out to me, or if I figure it out, but not always.

Now, it should be obvious that there are pictures at the article. They'd be a pain to reproduce here, and why bother, when I already have the link up there in the headline?

We use context to figure out what we are seeing. Something surrounded by smaller things is often quite big.

Which is why it's important to hang out with people smaller than you are. Or bigger, depending on the effect you're looking for.

How much you are affected by illusions like these depends on who you are. For example, women are more affected by the illusion than men – they see things more in context.

The article includes a link to, presumably, a study that supports this statement. I say 'presumably,' because when I checked this morning, the link wasn't working. So I can't really validate or contradict that assertion, but I do question the validity of the "they see things more in context" statement.

Young children do not see illusions at all.

The link to that study did work for me, and from what I can tell, it was about a particular subset of illusions, not "all."

The culture you grew up in also affects how much you attend to context. Research has found that east Asian perception is more holistic, taking everything into account. Western perception is more analytic, focusing on central objects.

None of which fulfills the promise of the headline.

This may also depend on environment. Japanese people typically live in urban environments. In crowded urban scenes, being able to keep track of objects relative to other objects is important.

Okay, this shit is starting to border on racism and overgeneralization. Also, the glib explanation is the sort of thing I usually find associated with evolutionary psychology, which reeks of bullshit.

However, what scientists did not know until now is whether people can learn to see illusions less intensely.

A hint came from our previous work comparing mathematical and social scientists’ judgements of illusions (we work in universities, so we sometimes study our colleagues). Social scientists, such as psychologists, see illusions more strongly.


And this is starting to sound like the same old "logical / creative" divide that people used to associate with left brain / right brain.

Despite all these individual differences, researchers have always thought that you have no choice over whether you see the illusion. Our recent research challenges this idea.

Whatever generalization they make, I can accept that there are individual differences in how strongly we see optical illusions. So this result, at least, is promising.

Radiologists train extensively, so does this make them better at seeing through illusions? We found it does. We studied 44 radiologists, compared to over 100 psychology and medical students.

And we finally get to the headline's subject, and I'm severely disappointed. 44? Seriously?
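For a sense of how much statistical wiggle room 44 people leaves, here's a toy margin-of-error calculation. It uses a generic proportion with worst-case p = 0.5, which is my illustration and not the study's actual analysis (they measured illusion strength, not a simple yes/no):

import math

# 95% margin of error for an estimated proportion at various sample sizes.
for n in (44, 144, 1000):
    margin = 1.96 * math.sqrt(0.5 * 0.5 / n)
    print(f"n = {n:4d}: about ±{margin:.1%}")

# n =   44: about ±14.8%
# n =  144: about ±8.2%
# n = 1000: about ±3.1%

Almost fifteen percentage points of slop either way at n = 44. Small samples can still hint at something real, but they leave a lot of room for noise.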

There is plenty left to find out.

I'll say.

Despite my misgivings about some of the details described, I feel like the key takeaway here is that it may be possible to train people away from seeing a particular kind of optical illusion. But it may be a better use of resources to train them to smell bullshit.
April 6, 2025 at 7:50am
#1086678
Once again, Mental Floss tackles the world's most pressing questions.

    Why Do So Many Maple Syrup Bottles Have a Tiny Little Handle?  Open in new Window.
It’s not for holding, that’s for sure.


Well, this one would be pressing if anyone in the US could still afford maple syrup.

Ideally, you’d be able to hold the handle of a maple syrup container while you carry it and also while you pour the syrup onto pancakes, waffles, or whatever other foodstuff calls for it.

Good gods, how big is your maple syrup container? I usually get the ones about the size of a beer bottle, which doesn't even require a handle. Or, you know, I used to, when we were still getting stuff from Canada.

But the typical handle on a glass bottle of maple syrup is way too small and positioned too far up the bottleneck to be functional in either respect.

So, why is it there?


Why is anything nonfunctional anywhere? Decoration, tradition, or for easy identification, perhaps.

The most widely accepted explanation is that the tiny handle is a skeuomorph, meaning “an ornament or design representing a utensil or implement,” per Merriam-Webster.

I'm actually sharing this article not to complain about trade wars, but because I don't think I'd seen 'skeuomorph' before, and it's a great word.

As the article goes on to note, it's apparently pretty common in software design. They use other examples, but here on WDC, we have a bunch of them. The magnifying glass for Search, the shopping cart (or trolley) for Shop, glasses for Read & Review, the gear icon for settings, and so on. I don't do website or graphic design, so I didn't know the word.

But there are plenty of skeuomorphs that don’t involve the transition from analog to digital life, and the useless handle of a maple syrup bottle is one of them.

I'd hesitate to call it "useless," myself. Obviously, it's not useful as a handle for carrying or pouring, but, clearly, it does have a purpose: marketing.

Here’s one popular version of the origin story: The little handle harks back to the days of storing liquids in salt-glazed stoneware that often featured handles large enough to actually hold.

Moonshine distillers, take note. (And yet, the article mostly debunks that origin story, as one might expect.)

Maple syrup manufacturers had started to add little handles to their glass bottles by the early 1930s. This, apparently, was a bit of a marketing tactic. “Maple syrup companies weren’t so much retaining an old pattern of a jug as reinventing it and wanting to market their product as something nostalgic,” Canada Museum of History curator Jean-François Lozier told Reader’s Digest Canada.

Like I said.

Perhaps one day, I will again have the opportunity to purchase delicious maple syrup. When I do, I'll be looking for the skeuomorph.
April 5, 2025 at 9:46am
#1086609
While Cracked ain't what it used to be (what is, though?), here, have a bite of this:

    5 Foods That Mutated Within Your Lifetime  Open in new Window.
We finally figure out what happened to jalapeños


It should go without saying that "mutated" is a bit misleading, but here I am, saying it anyway.

We know that companies keep tinkering with the recipes behind processed foods, changing nitrates or benzoates so you’ll become as addicted as possible.

Wow, that would suck, becoming addicted to food.

More basic foods, however, are more dependable.

And, of course, here's the countdown list to contradict that.

5 Brussels Sprouts

A couple decades ago, jokes on kids’ shows would keep saying something or another about a character hating Brussels sprouts.


Pretty sure it was more than a couple of decades ago. But the Brussels sprouts thing didn't stick in my memory. Broccoli did. Of course, as I got older and didn't have to eat them the way my mom overcooked them, I learned to like both. And when I got even older, I had my mind blown with the fact that they are the same species.

If you were around back then, you probably learned that Brussels sprouts tasted gross before you’d ever heard of the city of Brussels.

Having been to Brussels, I still don't know what they call them there. Sprouts, probably, or whatever the French or Dutch word for sprouts is. Like how Canadian bacon is called bacon (or back bacon) in Canada, or French fries are called frites in Brussels, because they're a Belgian invention, not French.

Unlike French fries, Brussels sprouts actually have a connection to Brussels. Well, not the city. It's hard to find extensive vegetable gardens in most major cities. But they were grown extensively in the surrounding countryside, or so I've heard.

Brussels sprouts used to taste bitter, but during the 1990s, we started crossbreeding them with variants that didn’t. When we were done, we’d bred the bitterness out.

There's an incident stuck in my head from several years ago, back when I did my own grocery shopping (so at least six years ago, and probably more), where I sauntered up to a supermarket checkout counter with a big bag of Brussels sprouts. The cashier started to ring me up, but then she looked me in the eye and said, "Can I ask you a question?"

"Sure."

She held up the bag o' sprouts. "How can you eat these things?"

I was rendered speechless for a moment, but retained enough presence of mind to say "With butter and garlic." Or maybe I just sputtered, and then a week or so later, lying awake at night, I finally came up with a good comeback, and edited my memory to make me look better.

Turns out there’s no moral law saying healthy stuff must taste bad.

Shhh, you can't say that in the US.

4 Pistachios

Pistachio nuts in stores used to always be red.


I don't think I ever noticed that.

Today, we instead associate pistachios with the color green, due to the light green color of the nuts and the deeper green color of the unripe shells.

I associate them with a lot of work and messy cleanup, but damn, they taste good.

3 Jalapeños

In the 1990s, the word “jalapeño” was synonymous with spicy.


Again, this is a US-oriented site. For many Americans, mayonnaise is too spicy, and anything else is way too spicy.

Today? Not so much. Maybe you’d call a habanero spicy, but jalapeños are so mild, you can eat a pile of them.

That's... not entirely true. It's actually worse than that: jalapeños have wildly varying levels of capsaicin, making it difficult to control the flavor of one's concoction when using that particular pepper.

Today, you might find yourself with one of the other many hotter jalapeño varieties, but there’s a good chance you’ll find yourself with TAM II or something similarly watery.

Which is why, when I want spicy peppers, I go with habanero or serrano. No, I don't use whole ghost peppers, but I do use ghost pepper sauce sometimes.

2 Sriracha Sauce

You know Sriracha sauce? Its label says that the primary ingredient is “chili,” and the chili pepper they use happens to be a type of jalapeño. At least it used to be, until some recent shenanigans.


I know it, and I sometimes use it, but my tongue refuses to pronounce it. It has no problem tasting it, though.

1 Apples

I don't think it would surprise many people to know that this iconic fruit has been selectively bred into hundreds of different varieties.

The most extreme victim of this aesthetics supremacy may be the Red Delicious apple. Today, it’s perhaps the most perfect-looking apple. It looks like it’s made of wax, and many say it tastes like it’s made of wax, too.

Nah, more like cardboard. I know what cardboard tastes like because I ate a pizza from Domino's once.

Buyers have started rebelling. If you aren’t satisfied with Red Delicious, you can try the increasingly popular Gala or Fuji apples.

On the rare occasions that I actually buy apples for eating, those are my choices, because they're tasty and they're usually available.

In summary, yeah, lots of foods have changed, and sometimes for the worse. What's remarkable isn't the change itself, but our ability to tinker with the genetics of what we eat. And we've been doing it for as long as we've been cultivating food. We can be quite clever, sometimes. But I question our collective taste.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
