hypertext, words and more


  • Katy Vine for Texas Monthly writes:

    Perhaps, instead of destroying the bacteria directly, the venom’s effect is indirect, kick-starting the immune system. Bee venom studies have shown promise in combating symptoms for autoimmune diseases such as multiple sclerosis and rheumatoid arthritis. According to Justin Schmidt, an insect venom expert at the Southwestern Biological Institute, in Tucson, it’s possible that when the immune system begins attacking itself, an injection of bee venom may help by providing an alternate target—“something to chew on,” he said, “and this tends to regulate the immune system so it does what it’s supposed to be doing, which is attack toxins that are getting into your body.” While Lyme is a bacterial infection, it sometimes mimics autoimmune disorders, and so maybe, somehow, similar rules apply.

    It’s also possible that the pain of the stings plays a role. “Maybe the venom is doing something to kick off pain receptors,” he said. Anecdotal evidence suggests that other types of venom may also work this way. A brief article in the Lancet, from 1983, described a 43-year-old woman in Arizona who had MS and went into remission for two months following a scorpion sting on her right foot. An immunologist in Houston told me she was contacted by a physician experiencing progressive MS who said he’d been stung by a sea anemone and went into temporary remission.

    This piece from Texas Monthly was mind-blowing. For one, I had no idea bee venom had a use in treating Lyme disease. The viability of the treatment is still unknown, and the entire ritual of using bee stingers has an Eastern-medicine quality to it (never mind that it kills our precious bee friends in the process). But whatever works for treating the pain of Lyme is a win in my book. Even if it is a placebo, pain management is hard.

  • Anne Ewbank at Atlas Obscura writes:

    Back then, it was a very different celery. Native to the Mediterranean and Middle East, wild celery has thin stalks and a bitter flavor. It was only later that farmers bred celery to have sturdy ribs and a sweeter profile. Its strong smell and dark color struck ancient Greeks as positively chthonic: that is, associated with the Underworld and death.

    As a result, celery became an essential part of burials. In ancient Greece, celery covered graves, and the dead were often crowned with it. We know this, writes classicist Robert Garland, because the first-century Greek writer Plutarch referred to celery as the most common plant used for the purpose. Historians have floated various theories as to why the dead needed to be garlanded. Perhaps they had faced life with courage, and deserved to be buried as heroes. Garland rejects this in favor of another theory: that the dead were given heroic crowns “to add dignity and lustre to the proceedings.” Other writers, such as the Roman Pliny the Elder, considered celery off-limits as an everyday food, since it was prominent at funeral banquets.

    Wow! Who would’ve thunk? Now I have an elevated level of respect for the unfortunate celery stalk that swims in my next Bloody Mary. Here’s to life, death, and to celery 🌱

  • Joshua Hunt for Pacific Standard Magazine writes:

    The name of the restaurant was 注文を間違える料理店, which means “The Restaurant of Mistaken Orders.” While its chefs are young professionals, the wait staff is made up entirely of elderly people living with dementia. One of the silver-haired waitresses, who has advanced Alzheimer’s disease, occasionally forgot what she was doing there.

    “What do I do?” she asked one young couple.

    “You’re here to help us order food,” the man said.

    “Ah, yes,” she said, then laughed gleefully while covering her mouth with one hand.

    It’s striking to witness such a jovial scene surrounding an issue that people tend to resist discussing openly in Japan, where 4.6 million people are living with Alzheimer’s disease or some other form of dementia. The country’s rapidly aging population means that, by 2025, the figure will rise to 7.3 million people, or one out of every five Japanese citizens over the age of 65.

    Here’s a brief clip of the restaurant in action from TBWA\HAKUHODO, and some of the community reactions as well:

    I don’t know about you, but dementia and Alzheimer’s are two of my greatest fears. I imagine I’m not alone in that regard. The self-destructive neurodegenerative attack on the nervous system is thought to involve misfolded proteins that may spread in a prion-like manner. Which is terrifying, and still a bit of a mystery.

    As such, treatment techniques are still largely hit or miss, depending on the severity of the disease’s progression, and many of the treatments still aren’t fully understood either. For now, the best thing we can do is fund Alzheimer’s research non-profits and support ethical treatment centers such as the Hogeweyk, near Amsterdam, which is essentially an entire village of medical professionals devoted to caring for 150+ patients.

    Places like The Restaurant of Mistaken Orders in Japan keep the conversation around vital treatment and research alive, and unconventional concepts like this give those suffering a reprieve from the social frustrations these sorts of neurodegenerative diseases can create. It’s a charming and lovely restaurant concept that I hope stays afloat for as long as possible. I think we can all take a page out of their playbook and bring dignity, patience, and respect to everyone in our lives. Whether they are afflicted with Alzheimer’s or just arthritis — a little patience goes a long way.

    Photo: @rebelvisual

  • Charlotte Graham-McLay for the New York Times:

    The firm ran the experiment — which reduced the workweek to 32 hours from 40 — in March and April this year, and asked two researchers to study the effects on staff.

    Jarrod Haar, a human resources professor at Auckland University of Technology, said employees reported a 24 percent improvement in work-life balance, and came back to work energized after their days off.

    “Supervisors said staff were more creative, their attendance was better, they were on time, and they didn’t leave early or take long breaks,” Mr. Haar said. “Their actual job performance didn’t change when doing it over four days instead of five.”

    Not surprised by the results of this interesting study. Not completely unrelated: taking an afternoon nap can lower your risk of heart-related death! If you take a step back from the results of either study, it’s apparent that reducing working hours (whether the work is physically or mentally exhausting) contributes to a better life.

    Photo By: Trent Szmolnik

    Just about every city in America should be all-in on improving the quality of life of its citizens. A change for the few impacts everyone. In other words: a rising tide lifts all boats. The New Zealand study also suggests a reduction in operating costs for businesses, and fewer cars on the road:

    Noting that the company had seen lower electricity bills with 20 percent less staff in the office each day, Mr. Barnes said the change in work hours could have wider implications if more companies adopted such a strategy.

    “You’ve got 20 percent of cars off the road in rush hour; there are implications for urban design, such as smaller offices,” he said.

    Reducing the workweek can be expensive for cities, though, as a similar study in Sweden shows. But it does corroborate the public-health results:

    The study showed that employees felt healthier, which reduced sick-leave absence, and that patient care improved, but the city won’t push ahead to make the plan permanent.

    Fascinating results. I would love to see a 4-day workweek become the norm. If you ask me, the trade-off is worth the coin spent.

  • Let’s Take a Walk

    Living in NYC, I walk a lot. Looking over my Activity Record on my iPhone, my average step count is probably run-of-the-mill for most New Yorkers.

    It should come as no surprise that the most walkable cities are also the most popular places to live. What should come as a surprise is that creativity may be a function of walking and time. By proxy, the cities that are most walkable are likely to be the most creative. More on that later.

    Let’s back up. This piece from the New York Times was initially published in 2014, but I only came across it via Hacker News. As a sidenote, I enjoy reading HN threads such as this one, which I just have to share:

    I’ve been doing this ever since I can remember. But it only works if I exit the room so I always assumed it’s related to the doorway effect (the brain forgets/discards stuff when you walk through a doorway). […]

    A reply:

    L’esprit d’escalier (“staircase wit”) [Wikipedia Link]

    I love how often HN comment threads take you from one place and lead you to another perspective, or down a metaphor less traveled. Anyway, back to the New York Times piece.

    With the enthusiastic support of her adviser, Daniel Schwartz, a professor in the Stanford Graduate School of Education, Dr. Oppezzo recruited a group of undergraduate students and set out to see if she could goose their creativity. Gathering her volunteers in a deliberately dull, unadorned room equipped with only a desk and (somewhat unusually) a treadmill, Dr. Oppezzo asked the students to sit and complete tests of creativity, which in psychological circles might involve tasks like rapidly coming up with alternative uses for common objects, such as a button. Then the participants walked on the treadmill, at an easy, self-selected pace that felt comfortable. The treadmill faced a blank wall. While walking, each student repeated the creativity tests, which required about eight minutes.

    For almost every student, creativity increased substantially when they walked. Most were able to generate about 60 percent more uses for an object, and the ideas were both “novel and appropriate.”

    Now that’s something I can get into! But, if you’re like me, it’s not like we can all work and walk simultaneously. Let’s be realistic here. What about walking then working? Or, vice versa for that matter.

    So Dr. Oppezzo next tested whether the effects lingered after a walk had ended. She had another group of students sit for two consecutive sessions of test-taking and subsequently walk for about eight minutes while tossing out ideas for object re-use, then sit and repeat the test.

    Again, walking markedly improved people’s ability to generate creative ideas, even when they sat down after the walk. In that case, the volunteers who had walked produced significantly more and subjectively better ideas than in their pre-exercise testing period.

    Not bad. Not bad at all. Measuring subjective results like creativity is obviously challenging. I’ll take these 2014 results with a grain of salt for sure.

    However, I’m reminded of Walter Isaacson’s biography of Steve Jobs. Jobs insisted on walking meetings, walk breaks, and walking brainstorms. Sure, Jobs was a colossal nutbag asshole who regularly destroyed interpersonal relationships in service of Apple, its shareholders, and ultimately his own demons. But his creativity, and his persistence in inspiring those around him to take the lead, are undeniably trademarks many hope to emulate even to this day. The fruits of those labors led Apple to where it is now.

    (Embedded Google Trends chart comparing U.S. search interest in “walkable cities” and “best places to move to,” 2008–2018.)

    If the Jobsian allusion doesn’t convince you, perhaps this will — New York City, Portland, Seattle, San Francisco, and a handful of others are listed as the most creative cities to live in within the US. What do most of these cities have in common? Considerable mass-transit infrastructure, notable walkability scores, and growing populations. Sure, none of those links are authorities on creativity. But one truth is evident: people are moving to walkable cities, or at the very least want to move to walkable cities.

    The next time you’re in a creative rut, my recommendation is to go take a walk. That, or move to a walkable city. 🙃

  • Researchers combined two forms of microscopy that drastically reduce damage to the cells being imaged:

    The current microscope fills a 10-foot-long table. “It’s a bit of a Frankenstein’s monster right now,” says Betzig, who is moving to the University of California, Berkeley, in the fall. His team is working on a next-generation version that should fit on a small desk at a cost within the reach of individual labs. The first such instrument will go to Janelia’s Advanced Imaging Center, where scientists from around the world can apply to use it. Plans that scientists can use to create their own microscopes will also be made freely available. Ultimately, Betzig hopes that the adaptive optical version of the lattice microscope will be commercialized, as was the base lattice instrument before it. That could bring adaptive optics into the mainstream.

    “If you really want to understand the cell in vivo, and image it with the quality possible in vitro, this is the price of admission,” he says.

    This little guy wiggles and stretches like a piece of gum. I hope your two-dimensional perception of cells has been completely shattered. I’m reminded of the famous footage of a white blood cell chasing bacteria. That footage was captured under a glass slide, which is why most cell footage appears to be cross-sectional or two-dimensional.

    Further Reading: