'Completely unbelievable': US pilots say 'traumatized' by intensity of Yemen retaliatory operations
  • Good effort on trying to represent your country’s perspective but it’s not holding up to scrutiny.

    You came over here bro. Pathetic behaviour.

    0
  • Bringing the power of books to the fediverse
  • If they tied a BookWyrm comments section to an ISBN, for example, then anybody or any site could easily embed it, making it a universal tool rather than something specifically connected to a piracy site.

    2
  • >"I have the map and the key to the mountain that was used in the film in a frame," he notes. "And I have Thorin's sword and his oaken shield. It's on my bookshelf!"

    Eleven years ago, Tolkienites rejoiced as The Hobbit: An Unexpected Journey landed in UK cinemas. With Lord of the Rings director Peter Jackson at the helm, a legion of actors including Martin Freeman, Ian McKellen, Cate Blanchett, and Orlando Bloom signed on to star.

    Joining them, British actor Richard Armitage won the role of Thorin Oakenshield – the legendary King of Durin's folk. Determined to reclaim the Lonely Mountain from Smaug and secure the coveted Arkenstone, Thorin had a story of greed and redemption that made him one of the most interesting characters in the trilogy.

    Reflecting on The Hobbit's enduring legacy and the profound effect that the franchise had on him, Richard, 52, spoke exclusively to HELLO! about his time on set.

    On why the role of Thorin was so special to him, Richard explained: "It had an impact on me because I think The Hobbit was one of the first books I ever read where I really allowed my imagination to engage.

    "I was completely absorbed by Tolkien. Then I found Lord of the Rings and I think it was where my early feelers were going towards being an actor, but I didn't realise it at the time," he continued.

    "So, when I came to playing Thorin Oakenshield as a 40-year-old, I was retracing my steps right back to being an eight-year-old in school and finding that book for the first time. So, it was just such a massive thing for me."

    As for his time on set, Richard revealed a particularly poignant memory from day one of production.

    "One of the fondest memories I had was on the very first day of shooting when Peter Jackson blessed his new sound stage with a Māori Haka. I had to speak Māori to the crew because they saw my character as the King of the Dwarves," he tells HELLO!.

    "And so they asked me to make this speech in Māori and the door was lifted and the sun was rising across the floor and it was incredibly moving. It was a really special moment."

    After wrapping the trilogy with The Hobbit: The Battle of the Five Armies (2014), Richard was able to take home a number of his most treasured possessions from the set, which he continues to cherish.

    "I have the map and the key to the mountain that was used in the film in a frame," he notes. "And I have Thorin's sword and his oaken shield. It's on my bookshelf!"

    After the success of The Lord of the Rings and The Hobbit, Tolkienites have since entered the Rings of Power era, following the release of Amazon Prime's high fantasy series in 2022.

    With the show renewed for a second season, naturally, we had to ask if Richard would be interested in a role of some sort. "I mean, I would love to, but I think it's very hard to do that. I'd have to be a different character because you couldn't bring Peter Jackson's version of Thorin Oakenshield into somebody else's. But I love the story," he said.

    3
    Good Omens Renewed for a 3rd and Final Season on Prime Video
  • I enjoyed the first one. Haven't got round to looking at the second one. Can't remember why I decided not to watch when it was released...

    1
  • Epub Site
  • Yeah, it's silly and odd and likely done to push customers towards formats that they have greater control over.

    Those epubs that aren't really epubs, randomly disallowing azw3 files (which they support officially!!!) from being downloaded directly from the Kindle's built-in browser, and other restrictive behaviour are all part of this. That's why I'm eventually looking to enable epubs on Kindle once the people at MobileRead find a way to do it. Apparently Calibre can be set up to send files via email too, so that's another option.

    1
  • Epub Site
    They're not though. They only do over-the-cloud conversions from epub to an Amazon proprietary format, which can make the covers or formatting go awry.

    1
  • 17kb image still too big for community icon?
  • It's been disabled for now because of CSAM spam.

    https://feddit.nl/post/7266922?scrollToComments=true

    If the community is on another server, I recommend using an alt from yet another server, making the alt a mod and then adding it that way. It worked for me.

    4
  • ‘I left the cinema, walked home and announced I was moving’: films that made people emigrate | 'Middle-earth does exist'
  • I remember the two boys from the YouTube channel BoyBoy(?) speaking on Trash Taste about going to North Korea. They said it was pretty easy and apparently the simplest process they'd had. Maybe Australia is strict with that sort of thing? The only issues they had thereafter were with entering the US apparently. US border people were apparently mad that they'd visited North Korea for some reason.

    I still need to watch their actual video of the trip: We Went To North Korea To Get A Haircut

    1
  • https://archive.is/eo7pO

    > I was binge-watching the DVDs with my wife, Sarah, when it hit me: Middle-earth does exist, and I don’t need a portal. I can fly there in 23 hours. I turned to Sarah and said, “Shall we move to New Zealand?” One of the many things I love about my wife is that she listens to my madder ideas with a careful seriousness. Six months later we were in Auckland.

    This has strong Bilbo Baggins vibes lol.

    5
    Bringing the power of books to the fediverse
  • I'm envisioning Bookwyrm behaving as a comments section for Anna's Archive (possibly for all or any decentralised book repository), but they'd be reviews instead. I'm reminded of the Disqus or Facebook comments that you often get embedded on certain sites.

    8
  • Nothing to see here
  • Mate, Palestinians just happen to exist and want to thrive. Stop ascribing some violent fantasy that never was to those poor people.

    1
  • Disney CEO Bob Iger says company's movies have been too focused on messaging
  • You sound sarcastic lol. Maybe it's because I follow a lot of artists who have similar concerns.

    1
  • www.denofgeek.com What The Hobbit Animated Movie Did Better Than the Peter Jackson Trilogy

    While aimed at a younger audience, Rankin/Bass' animated version of The Hobbit gets to the true meaning of the story better than Peter Jackson's trilogy, and in only one-sixth the time!

    The animated adaptations of The Hobbit and The Lord of the Rings from the 1970s and 1980s have a bit of a bad reputation these days, but it is not entirely deserved. In particular, Arthur Rankin Jr. and Jules Bass’ 1977 TV movie of The Hobbit, with a screenplay by Romeo Muller, gets a lot of things right that Peter Jackson’s three-part live-action film adaptation did not.

    The most obvious advantage that the animated version has over the live-action films is its length. That the live-action movies are too long is pretty well established, but by way of a reminder, the book of The Hobbit is about 300 pages long, with slight variations between editions. Other books of similar length that have been adapted for the screen include Jane Austen’s Pride and Prejudice, Emma Donoghue’s Room, John Green’s The Fault in Our Stars, and Harper Lee’s To Kill a Mockingbird. One thing all of these have in common is that each was adapted into a single film, two to three hours long. Pride and Prejudice has also been adapted into a six-hour BBC miniseries, but none of them has been stretched out to just under eight hours, the combined length of the theatrical cuts of the three live-action Hobbit movies. (They run to just under nine hours if you are watching the extended editions.)

    The Rankin/Bass version of The Hobbit, on the other hand, is a mere one hour and 17 minutes, which you could almost argue is too short. The introductions of Elrond—who sports an inexplicable crown of stars around his head—and Beorn, for example, could have done with a little more room to breathe. But for a fairly slight story, a runtime that is a little too short feels like an improvement on one that is far too long.

    One thing both versions genuinely get right is the music, though the Rankin/Bass film uses it in a different way to the live-action movies. In Jackson’s 2012 film The Hobbit: An Unexpected Journey, Howard Shore’s “Far Over the Misty Mountains Cold,” performed by Richard Armitage and the other dwarves in an incredibly evocative basso profundo, is a thing of beauty. The Rankin/Bass The Hobbit also features a musical setting of the same song from the book, and although it lacks the power of that incredible bass voice, it’s a good piece of music in its own right.

    But the Rankin/Bass movie doesn’t stop there; it’s actually a musical, with relatively short songs peppered throughout the story. This is a completely valid choice too. Author J.R.R. Tolkien’s books are full of songs, and nearly all of the songs that appear in the film are settings of Tolkien’s own songs from the book, using his lyrics. The only exception is the theme song, “The Greatest Adventure,” which is a complete original.

    Making the film as a musical also fits the overall tone viewers would have expected from Rankin/Bass. The studio was known for its holiday specials: made-for-television, animated or stop-motion films that usually aired around Christmas time. Rudolph the Red-Nosed Reindeer (1964) and Frosty the Snowman (1969) had already become holiday staples by the time of The Hobbit, and both heavily featured songs, with Rudolph in particular a full musical. A film with short songs appearing frequently is exactly what audiences would expect from the Rankin and Bass studio, as was animation aimed at a “family” audience, primarily children. This is the biggest thing Rankin and Bass got right and the Jackson movies get wrong: The Hobbit is a story for children.

    When Tolkien originally imagined The Hobbit in the 1930s, it was as a story for his own children, and was not originally connected to the wider mythology of Middle-earth. It was only as time went on that the story got drawn into his bigger mythmaking project. And while The Lord of the Rings is clearly a novel aimed at an adult readership, The Hobbit is equally clearly intended for a younger audience. Bookshops these days generally shelve it with Middle Grade fiction aimed at children aged roughly eight to 12, not far from Tolkien’s friend C.S. Lewis and his Narnia books (which Tolkien did not like, and probably would not appreciate seeing next to his own work!).

    In fairness to Peter Jackson, Tolkien did come to regret the tone and style of The Hobbit. This was partly because it made it stick out like a sore thumb next to his other writing about Middle-earth, but also because Tolkien came to believe passionately that children should not be talked (or written) down to, and that children’s literature did not require some kind of special, slightly silly tone. In a letter to W. H. Auden in 1955, Tolkien said of The Hobbit: “It was unhappily really meant, as far as I was conscious, as a ‘children’s story’, and as I had not learned sense then… it has some of the sillinesses of manner caught unthinkingly from the kind of stuff I had had served to me… I deeply regret them. So do intelligent children.”

    There’s an argument to be made, therefore, that The Hobbit should be transformed into something with a darker, more adult tone in an adaptation. Jackson probably felt he had little choice in the matter anyway since his live-action Hobbit movies were prequels to his live-action The Lord of the Rings movies—and those, as is appropriate to The Lord of the Rings, have a tone of high epic fantasy with an intended audience of adults and older teenagers.

    But Tolkien regretted the tone and style of The Hobbit, not because he regretted writing it for children, but because he felt that writing for children should not engage in “sillinesses of manner.” It is still a story intended primarily for children, and while of course film adaptations have to make changes, the Rankin and Bass film feels more like it captures the spirit of The Hobbit because it is aimed primarily at children. There are no lewd jokes, the scary sequences are kept at an appropriate level, and of course, the runtime will not test the patience of elementary school aged children too much.

    One of the main ways Rankin and Bass make it clear that this is a film intended for children and their families is by deliberately echoing aspects of Disney’s animated films. The decision to make the film a musical is one obvious similarity with Disney’s animated fairy tales, but there are others as well. The similarity in the character design of the dwarves in The Hobbit to the dwarfs from Disney’s 1937 classic Snow White and the Seven Dwarfs is not a coincidence. (By the way, if anyone is wondering, Tolkien was quite particular about the fact that his imaginary creatures were dwarves, as opposed to dwarfs. In his Author’s Note at the beginning of The Hobbit, Tolkien explained that “in English the only correct plural of dwarf is dwarfs, and the adjective is dwarfish. In this story dwarves and dwarvish are used, but only when speaking of the ancient people to whom Thorin Oakenshield and his companions belonged.”)


    To cement the Disney-like feel, the film opens on an image of a big, hard-bound and illustrated book, just like Disney’s Snow White, Pinocchio (1940), Cinderella (1950), Sleeping Beauty (1959), The Sword in the Stone (1963), The Jungle Book (1967), and Robin Hood (1973). Interestingly, Rankin/Bass’ The Hobbit opens on an image of the book by J.R.R. Tolkien, with the author’s name and the famous first line (“In a hole in the ground there lived a hobbit”) clearly visible. The end of the movie, on the other hand, shows Bilbo’s in-universe Red Book, titled There and Back Again: A Hobbit’s Holiday, before finishing on an image of the One Ring, glinting in a glass case on Bilbo’s mantelpiece. Both clearly parallel the Disney trope, especially the opening, which places the story firmly in a fictional, “fairytale” universe.

    There are of course some things the Jackson movies got right that the Rankin/Bass version did not. One of the more inexplicable decisions made for the animated movie was to increase the body count of named characters in the climactic Battle of the Five Armies. In Tolkien’s novel, and in Jackson’s film, the only three members of the Company to die are Thorin, Kili, and Fili. Rankin and Bass, however, kill off seven of the Dwarves, only even naming Thorin and Bombur, both of whom die on screen. We can assume that Kili and Fili were among the seven killed and that Balin survived (since he has to go and die in Moria sometime between The Hobbit and The Lord of the Rings), but it is a mystifying decision, especially since films aimed at children do not usually increase the body count.

    The Jackson movies also include some fantastic sequences that, taken on their own, are perfect screen adaptations of scenes from the book. Bilbo’s riddle-game in the dark with Gollum under the Misty Mountains in An Unexpected Journey and his verbal sparring with Smaug in The Desolation of Smaug are near perfect, helped by fantastic performances from Martin Freeman as Bilbo, Andy Serkis as Gollum, and Benedict Cumberbatch as Smaug. The casting is all around pretty perfect, especially Freeman, and while Legolas’ added storyline was controversial, the character really is supposed to be the son of Thranduil, the King of Mirkwood, and it’s rather nice to see him slotted in there, even if the role he is given is not to everyone’s taste.

    Overall though, the Rankin/Bass animated adaptation of The Hobbit, whatever its flaws, feels like it captures a bit more of the feel of the book, even if it leans quite heavily toward a more Disney-like tone. And if you have young children, you are almost certainly better off trying to show them this version instead of the live-action films. That is, if you are too impatient to wait until they are old enough to read the book!

    0
    Blood is thicker than water
  • We're all descendants of some squirrel-monkey-mammal creature!

    1
  • The state of the discourse.
  • It's not worthless, there's 500 million dollars worth of gas north of Gaza that Israel wants to secure. They're already stealing it and have been for years. And the 75 year long occupation must end of course.

    6
  • Nepal decides to ban TikTok
  • I would have thought this was common knowledge. I suspect these redditors just don’t put any effort into recall or thinking in general.

    2
  • Nepal decides to ban TikTok
  • No you dimwit. I read the papers. Normal people do do that. Dimwit.

    -3
  • Nepal decides to ban TikTok
  • I would have thought this was common knowledge. I suspect these redditors just don't put any effort into recall or thinking in general.

    -3
  • TX school bans trans boy from playing "Oklahoma!" male lead, recasts with cisgender male student
  • Well this will fix the various social crises in that country for sure.

    16
  • Nepal decides to ban TikTok
  • Will they ban Facebook which facilitated a genocide?

    edit: there are many a genocide lover on lemmy sadly.

    -7
  • Rinse and Repeat
  • Gay rights are human rights.

    -1
  • www.scotsman.com Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself

    cross-posted from: https://lemm.ee/post/12865151

    Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    > A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself

    ***

    At Spynie Palace in 1662, John Innes of Leuchars had a serious problem on his hands. Local people were complaining to him about milkless cows, shrivelling crops and dying children. Pretty obvious that a witch was on the loose. As the local law enforcement thereabouts, John was expected to do something, but witch-hunting was not in Mr Innes’s skill set.

    It must have been a relief when a slight young man almost magically appeared in front of him: John Dickson’s the name, and witch-hunting’s the game. Bags of experience. Happy to sort the problem out. He possibly dropped the name of superstar witch-hunter John Kincaid into the conversation, a Tranent man with a formidable reputation as Scotland's most fearsome witch pricker or ‘brodder’.

    The Scots didn't do witch-ducking. We went for the needle. The Devil gave his followers marks somewhere on their bodies. Where the Devil left his mark, there would be no blood, and no pain. Kincaid and his like would use the needle to ‘prick’ the accused. The words prick and needle are misleading. This needle was no dainty thing to be lost easily in a haystack. These were more like hefty great crochet hooks. The ‘pricking’ was more of a violent slam into the body.

    The mark could be anywhere. The accused were stripped and shaved, and the needle plunged in. Some victims didn’t move, scream or bleed – the mark had been found. Possibly they couldn’t move. They may have been in deep shock. These were pious times.

    Women rarely left home without covering their heads; now they stood publicly naked, shaved and exhausted. There may well have been little or no bleeding if the needle hit a part of the body with a poor blood supply. Or perhaps the needle was retractable.

    There are clues to such trickery. In the late 17th century, a witch-hunter nicknamed “The Bloody Juglar” turned up in Berwick-upon-Tweed. Pretty quickly his trusty needle pricked a victim and drew no blood. A witch, ready for trial and execution. Hold up, said Colonel Fenwick, the town’s military governor. He called in the mayor and the magistrates. He was worried that this evidence was falsely procured. He had his suspicions about that needle.

    Why not get The Bloody Juglar to do the pricking again, but with a council-provided needle? Our boy baulked – “by no means would he be induced unto”. To the good people of Berwick, this “was a sufficient Discovery of Knavery”. The Juglar was busted.

    John Kincaid may have been a knave, but between 1649 and 1662 he rampaged freely. It was lucrative. He pocketed £6 for a discovery of a witch at Burntcastle estate. They chucked in another £3 to cover the booze bill for him and his manservant.

    The year 1659 was a busy one. Kincaid seems to have pricked profitably in East Lothian, where 18 accused witches were executed. In 1661, Forfar was so chuffed with his efforts that they gave him the freedom of the burgh.

    Perhaps young John Dickson was inspired by Kincaid. Seemed a good trade for a lad, finding God's enemies and being very well paid for it, too. John headed north, fetched up at Spynie Palace and appeared before the harassed Innes, who wasted no time in signing up his new witch-hunter to an exclusive contract.

    John was on a good retainer with performance-related bonuses, six shillings a day expenses plus £6 per witch caught. In no time at all, our man on the make had two servants and a very fancy horse. He was on-call and carried out witch-pricking in Elgin, Forres, Inverness and Tain. He possibly pricked Isobel Goudie, Scotland’s most famous witch.

    He had a particular take on the procedure. Folk called him the Pricker “because of his use of a long brasse pin”. He had his victims stripped naked, then the “spell spot was seen and discovered. After rubbing over the whole body with his palms.” In a vicious witch-hunt/clan war in Wardlaw on the banks of Loch Ness, 14 women and one man were treated so savagely under John’s direct supervision that some of them died.

    Our boy was on a roll, until he did something stupid. He pricked a man named John Hay, a former messenger to the Privy Council. Now, this was not a man to mess with. He had connections. He wrote to Edinburgh complaining in an incredibly civil servant manner, denouncing the witch-pricker who worked on his case as a “cheating fellow” who carried out the torture without a licence. Even witch-hunters need the correct paperwork.

    The Privy Council in Edinburgh agreed. They called the maverick Mr Dickson in for a word. And they made a terrible discovery: John Dickson was a woman. Her name was Christian Caddell, and she came from Fife. Oh, she could tell a witch, no doubt about it. She claimed she spotted them by looking into their eyes and seeing an upside-down cross.

    Of course, this was not the scientifically accepted manner of witch-finding. A needle must be used. And, obviously, you needed to be a man.

    Christian stood trial, not for fake witch hunting, torturing or even for those murderous deaths, but for wearing men’s clothing. She was sentenced to transportation, and on May 6 she sailed from the port of Leith on the ship Mary, bound for Barbados.

    On the day she left Scotland, Isobel Elder and Isabel Simson, pricked by John Dickson, aka Christian Caddel, were burned in Forres. Just because you were discovered to be a witch in the wrong way didn’t mean to say you were innocent. They were the last two victims of the cross-dressing counterfeit witch-pricker.

    4

    Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    cross-posted from: https://lemm.ee/post/12865151

    Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    > A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself

    ***

    At Spynie Palace in 1662, John Innes of Leuchars had a serious problem on his hands. Local people were complaining to him about milkless cows, shrivelling crops and dying children. Pretty obvious that a witch was on the loose. As the local law enforcement thereabouts, John was expected to do something, but witch-hunting was not in Mr Innes’s skill set.

    It must have been a relief when a slight young man almost magically appeared in front of him: John Dickson’s the name, and witch-hunting’s the game. Bags of experience. Happy to sort the problem out. Possibly dropped the name of superstar witch-hunter John Kincaid into the conversation, a Tranent man with a fearsome reputation as Scotland's most fearsome witch pricker or ‘brodder’.

    The Scots didn't do witch-ducking. We went for the needle. The Devil gave his followers marks somewhere on their bodies. Where the Devil left his mark, there would be no blood, and no pain. Kincaid and his like would use the needle to ‘prick’ the accused. The words prick and needle are misleading. This needle was no dainty thing to be lost easily in a haystack. These were more like hefty great crochet hooks. The ‘pricking’ was more of a violent slam into the body.

    The mark could be anywhere. The accused were stripped and shaved, and the needle plunged in. Some victims didn’t move, scream or bleed – the mark had been found. Possibly they couldn’t move. They may have been in deep shock. These were pious times.

    Women rarely left home without covering their heads, now they stood publicly naked, shaved and exhausted. There may well have been little or no bleeding, if the needle hit a part of the body with a poor blood supply. Or perhaps the needle was retractable.

    There are clues to such trickery. In the late 17th century, a witch-hunter nicknamed “The Bloody Juglar” turned up in Berwick-upon-Tweed. Pretty quickly his trusty needle pricked a victim and drew no blood. A witch, ready for trial and execution. Hold up, said Colonel Fenwick, the town’s military governor. He called in the mayor and the magistrates. He was worried that this evidence was falsely procured. He had his suspicions about that needle.

    Why not get The Bloody Juglar to do the pricking again, but with a council-provided needle? Our boy baulked – “by no means would he be induced unto”. To the good people of Berwick, this “was a sufficient Discovery of Knavery”. The Juglar was busted.

    John Kincaid may have been a knave, but between 1649 and 1662 he rampaged freely. It was lucrative. He pocketed £6 for a discovery of a witch at Burntcastle estate. They chucked in another £3 to cover the booze bill for him and his manservant.

    The year 1659 was a busy one. Kincaid seems to have pricked profitably in East Lothian, where 18 accused witches were executed. In 1661, Forfar was so chuffed with his efforts that they gave him the freedom of the burgh.

    Perhaps young John Dickson was inspired by Kincaid. Seemed a good trade for a lad, finding God's enemies and being very well paid for it, too. John headed north, fetched up at Spynie Palace and appeared before the harassed Innes, who wasted no time in signing up his new witch-hunter to an exclusive contract.

    John was on a good retainer with performance-related bonuses, six shillings a day expenses plus £6 per witch caught. In no time at all, our man on the make had two servants and a very fancy horse. He was on-call and carried out witch-pricking in Elgin, Forres, Inverness and Tain. He possibly pricked Isobel Goudie, Scotland’s most famous witch.

    He had a particular take on the procedure. Folk called him the Pricker “because of his use of a long brasse pin”. He had his victims stripped naked, then the “spell spot was seen and discovered. After rubbing over the whole body with his palms.” In a vicious witch-hunt/clan war in Wardlaw on the banks of Loch Ness, 14 women and one man were treated so savagely under John’s direct supervision that some of them died.

    Our boy was on a roll, until he did something stupid. He pricked a man named John Hay, a former messenger to the Privy Council. Now, this was not a man to mess with. He had connections. He wrote to Edinburgh complaining in an incredibly civil servant manner, denouncing the witch-pricker who worked on his case as a “cheating fellow” who carried out the torture without a licence. Even witch-hunters need the correct paperwork.

    The Privy Council in Edinburgh agreed. They called the maverick Mr Dickson in for a word. And they made a terrible discovery: John Dickson was a woman. Her name was Christian Caddell, and she came from Fife. Oh, she could tell a witch, no doubt about it. She claimed she spotted them by looking into their eyes and seeing an upside-down cross.

    Of course, this was not the scientifically accepted manner of witch-finding. A needle must be used. And, obviously, you needed to be a man.

    Christian stood trial, not for fake witch hunting, torturing or even for those murderous deaths, but for wearing men’s clothing. She was sentenced to transportation, and on May 6 she sailed from the port of Leith on the ship Mary, bound for Barbados.

    On the day she left Scotland, Isobel Elder and Isabel Simson, pricked by John Dickson, aka Christian Caddel, were burned in Forres. Just because you were discovered to be a witch in the wrong way didn’t mean to say you were innocent. They were the last two victims of the cross-dressing counterfeit witch-pricker.

    0

    Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    www.scotsman.com Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself

    Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

    > A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself

    ***

    At Spynie Palace in 1662, John Innes of Leuchars had a serious problem on his hands. Local people were complaining to him about milkless cows, shrivelling crops and dying children. Pretty obvious that a witch was on the loose. As the local law enforcement thereabouts, John was expected to do something, but witch-hunting was not in Mr Innes’s skill set.

    It must have been a relief when a slight young man almost magically appeared in front of him: John Dickson’s the name, and witch-hunting’s the game. Bags of experience. Happy to sort the problem out. Possibly dropped the name of superstar witch-hunter John Kincaid into the conversation, a Tranent man with a fearsome reputation as Scotland's most fearsome witch pricker or ‘brodder’.

    The Scots didn't do witch-ducking. We went for the needle. The Devil gave his followers marks somewhere on their bodies. Where the Devil left his mark, there would be no blood, and no pain. Kincaid and his like would use the needle to ‘prick’ the accused. The words prick and needle are misleading. This needle was no dainty thing to be lost easily in a haystack. These were more like hefty great crochet hooks. The ‘pricking’ was more of a violent slam into the body.

    The mark could be anywhere. The accused were stripped and shaved, and the needle plunged in. Some victims didn’t move, scream or bleed – the mark had been found. Possibly they couldn’t move. They may have been in deep shock. These were pious times.

Women rarely left home without covering their heads; now they stood publicly naked, shaved and exhausted. There may well have been little or no bleeding, if the needle hit a part of the body with a poor blood supply. Or perhaps the needle was retractable.

    There are clues to such trickery. In the late 17th century, a witch-hunter nicknamed “The Bloody Juglar” turned up in Berwick-upon-Tweed. Pretty quickly his trusty needle pricked a victim and drew no blood. A witch, ready for trial and execution. Hold up, said Colonel Fenwick, the town’s military governor. He called in the mayor and the magistrates. He was worried that this evidence was falsely procured. He had his suspicions about that needle.

    Why not get The Bloody Juglar to do the pricking again, but with a council-provided needle? Our boy baulked – “by no means would he be induced unto”. To the good people of Berwick, this “was a sufficient Discovery of Knavery”. The Juglar was busted.

    John Kincaid may have been a knave, but between 1649 and 1662 he rampaged freely. It was lucrative. He pocketed £6 for a discovery of a witch at Burntcastle estate. They chucked in another £3 to cover the booze bill for him and his manservant.

    The year 1659 was a busy one. Kincaid seems to have pricked profitably in East Lothian, where 18 accused witches were executed. In 1661, Forfar was so chuffed with his efforts that they gave him the freedom of the burgh.

    Perhaps young John Dickson was inspired by Kincaid. Seemed a good trade for a lad, finding God's enemies and being very well paid for it, too. John headed north, fetched up at Spynie Palace and appeared before the harassed Innes, who wasted no time in signing up his new witch-hunter to an exclusive contract.

    John was on a good retainer with performance-related bonuses, six shillings a day expenses plus £6 per witch caught. In no time at all, our man on the make had two servants and a very fancy horse. He was on-call and carried out witch-pricking in Elgin, Forres, Inverness and Tain. He possibly pricked Isobel Goudie, Scotland’s most famous witch.

He had a particular take on the procedure. Folk called him the Pricker “because of his use of a long brasse pin”. He had his victims stripped naked; then, “after rubbing over the whole body with his palms”, the “spell spot was seen and discovered”. In a vicious witch-hunt/clan war in Wardlaw on the banks of Loch Ness, 14 women and one man were treated so savagely under John’s direct supervision that some of them died.

    Our boy was on a roll, until he did something stupid. He pricked a man named John Hay, a former messenger to the Privy Council. Now, this was not a man to mess with. He had connections. He wrote to Edinburgh complaining in an incredibly civil servant manner, denouncing the witch-pricker who worked on his case as a “cheating fellow” who carried out the torture without a licence. Even witch-hunters need the correct paperwork.

    The Privy Council in Edinburgh agreed. They called the maverick Mr Dickson in for a word. And they made a terrible discovery: John Dickson was a woman. Her name was Christian Caddell, and she came from Fife. Oh, she could tell a witch, no doubt about it. She claimed she spotted them by looking into their eyes and seeing an upside-down cross.

    Of course, this was not the scientifically accepted manner of witch-finding. A needle must be used. And, obviously, you needed to be a man.

Christian stood trial, not for fake witch-hunting, torturing or even for those murderous deaths, but for wearing men’s clothing. She was sentenced to transportation, and on May 6 she sailed from the port of Leith on the ship Mary, bound for Barbados.

On the day she left Scotland, Isobel Elder and Isabel Simson, pricked by John Dickson, aka Christian Caddell, were burned in Forres. Just because you were discovered to be a witch in the wrong way didn’t mean to say you were innocent. They were the last two victims of the cross-dressing counterfeit witch-pricker.

    daily.jstor.org Stonehenge Before the Druids (Long, Long, Before The Druids) - JSTOR Daily

    The clash of academic archaeology and what might be called folk archaeology comes into stark focus at Stonehenge.

cross-posted from: https://lemm.ee/post/12600657

***

Seventeenth-century English antiquarians thought that Stonehenge was built by Celtic Druids. They were relying on the earliest written history they had: Julius Caesar’s narrative of his two unsuccessful invasions of Britain in 55 and 54 BC. Caesar had said the local priests were called Druids. John Aubrey (1626–1697) and William Stukeley (1687–1765) cemented the Stonehenge/Druid connection, while self-styled bard Edward Williams (1747–1826), who changed his name to Iolo Morganwg, invented “authentic” Druidic rituals.

    Druidism has come a long way since. In 2010, The Druid Network was listed as a charity in England and Wales, essentially marking the official recognition of Druidism as a religion. (74,000 called themselves Druids in a recent census.) Historian Carole M. Cusack positions Druidism as one of the branches of the tree of Paganism and/or New Age-ism(s), which burst into all sorts of growth during the twentieth century. Modern Druidism fits into the smorgasbord of what Cusack calls the “deregulated spiritual marketplace” of our times.

    But there’s a disconnect here. In the popular imagination, Stonehenge and Druidism now go together like tea and crumpets. Historically, Stonehenge, a product of Neolithic Britain, predates Caesar by thousands of years. It had nothing to do with Druids and certainly nothing to do with modern Druidism.

“The false association of [Stonehenge] with the Druids has persisted to the present day,” Cusack writes, “and has become a form of folklore or folk-memory that has enabled modern Druids to obtain access and a degree of respect in their interactions with Stonehenge and other megalithic sites.”

Meanwhile, archaeologists continue to explore the centuries of construction at Stonehenge and related sites like Durrington Walls and the Avenue that connects Stonehenge to the River Avon. Neolithic Britons seem to have come together to transform Stonehenge into the ring of giant stones—some from 180 miles away—we know today. Questions about construction and chronology continue, but current archaeological thinking is dominated by the findings and analyses of the Stonehenge Riverside Project of 2004–2009, whose surveys and excavations were the first major archaeological explorations of Stonehenge and its surroundings since the 1980s. The project archaeologists postulate that Stonehenge was a long-term cemetery for cremated remains, with Durrington Walls serving as the residences and feasting center for its builders.

    The hippie-turned-New Age movements birthed in the 1960s and 1970s resulted in a surge of interest in Stonehenge. Tens of thousands, not all of them Druids, attended the Stonehenge Free People’s Festival starting in 1974. In 1985, the festival was halted by English Heritage, the organization that maintains Stonehenge today, because of the crowds, disorder, and vandalism. Druids were also banned from performing rituals on site. However, English Heritage and the Druids soon came to an understanding: Druids could use the site as long as there was no associated festival.

    So the clash of academic archaeology and what might be called folk archaeology comes into stark focus at Stonehenge.

Modern paganism is not without interest, of course, but continuing revelations about prehistory—whether of Neolithic Britain or elsewhere—should be a lot more interesting. As are the techniques used to extract data from the past: an example used to telling effect by the Stonehenge Riverside Project is the analysis of lipid remains on pottery: we can tell if a pot held dairy products or the fat of ruminants or pigs, giving insights into the diet four thousand years ago. Another example: strontium isotopes in bovine molars show that beef consumed at Durrington Walls was raised at least thirty miles away.

    Of course, all this is not as photogenically mysterious/magical as robed Druids in the long shadows of a midwinter sunset. Academic archaeology, which suffers from charges of “elitism” in the reactionary populist politics of anti-intellectualism and anti-science, has a hard time competing with the popular irrationality of mysticism. Maybe the real Stonehenge needs more publicists.

    ***

    Subscribe to !history@lemm.ee and !history@lemmy.ml

    bigthink.com “Eucatastrophe”: Tolkien on the secret to a good fairy tale

    For Nietzsche and Hesiod, hope is the cruelest evil because it prolongs man's torment. For Tolkien and Marcel, hope is all there is.

• For J.R.R. Tolkien, the single most important element of a fairy tale was the dramatic reversal of misfortune in the story's ending.

    Key Takeaways

    • In Greek mythology, the story of Pandora's box comes in (at least) two versions. In one, hope is released as the final evil in the world. In another, hope is the only consolation and weapon we have.
    • J.R.R. Tolkien coined the word “eucatastrophe” to describe a hallmark of good fairy tales: Good people win out despite the odds. Hope, in other words, is a vital story component.
    • For Tolkien and the Christian existentialist Gabriel Marcel, hope is the most important disposition we can possess. Without it, the darkness of the world will win out.

    There are at least two versions of the story of Pandora’s box. In the classic version from the Greek poet Hesiod, when Pandora’s curiosity got the better of her, she unleashed into the world all sorts of evils: sickness, famine, death, and people who ask questions at the end of a meeting. When Pandora finally closed the jar, she left only one “evil” inside: hope. For Hesiod, there’s nothing so cruel as hope. Hope is what forces us to carry on building, fixing, and loving when the world offers only destruction, chaos, and heartbreak. It’s what gets us off the ground only to be punched back down. Hope is the naivety of a fool. As Friedrich Nietzsche put it, “Hope, in reality, is the worst of all evils because it prolongs the torments of man.”

    Another variation of the Pandora’s box story is a Greek fable called “Zeus and the Jar of Good Things.” In this account, everything is inverted. The jar does not contain misery but good things. When “mankind” (there’s no Pandora in this version) opened the jar, they let out and lost all these good things: the things that would have made life a paradise. When the lid was closed, there was only one divine blessing left: “Hope alone is still found among the people.”

    The author J.R.R. Tolkien and the Christian existentialist Gabriel Marcel would likely prefer the second version. After all, they considered hope to be perhaps the most important part of being human.

    The eucatastrophe

    Kurt Vonnegut is famous for writing novels like Slaughterhouse-Five and Cat’s Cradle. In storytelling circles, he’s famous for his “shapes of stories.” These were eight diagrams that define the traditional arcs of common stories, like “Boy Meets Girl” or “From Bad to Worse.” His arc about fairy tales goes like this: Things start badly and then get a bit better. But then there’s a catastrophe that brings everything to ruin. The story ends with a drastic upheaval in fortunes — a transformation and magical finale — and everyone lives happily ever after.

Tolkien, were he alive, would agree. For him, the single most important element of a fairy tale is this final dramatic reversal of misfortune. He coined the word “eucatastrophe” to describe it. “The consolation of fairy-stories [is] the joy of the happy ending: or more correctly of the good catastrophe, the sudden joyous ‘turn’,” Tolkien wrote. The Lord of the Rings does not end with the hobbits dead and Sauron cackling over his orcish, industrial empire. It ends with light beating dark — with simple kindness, love, and companionship winning out over evil.

    Lifting the heart

    Tolkien is very careful to make the point that this is not some form of escapism. It’s not quixotic wish fulfillment. It does not pretend the world is an endlessly happy idyll of singing dwarves and affable wizards. The world has great suffering and misery, and there are plenty of nightmares to be found. The eucatastrophe, though, is “the joy of deliverance; it denies (in the face of much evidence, if you will) universal final defeat.”

The purpose of a good fairy story is not to hide the shadows of the world. The original Grimms’ Fairy Tales (not the sanitized Disney versions) were full of infanticide, cannibalism, and horror. The mark of a good fairy story, Tolkien wrote, “…[is that] however fantastic or terrible the adventures, it can give to child or man that hears it, when the ‘turn’ comes, a catch of the breath, a beat and lifting of the heart, near to (or indeed accompanied by) tears.”

    Hope is all we have

    The religious undertones here are not accidental. Tolkien was a Catholic who was fond of the redemption and grace found in the narratives of the Bible. Marcel did not, as far as we know, read Tolkien, but his own philosophy of hope bears striking similarities.

    What Tolkien describes as the eucatastrophe, or final deliverance, Marcel called hope. For Marcel, “Hope consists in asserting that there is at the heart of being, beyond all data, beyond all inventories, and all calculations, a mysterious principle which is in connivance with me.”

    Hope is the belief in an order to the Universe — an order where everything will turn out well enough. It is a kind of faith that simply refuses to accept that things are broken, or that misery, suffering, and death are all that exist. Marcel was a Christian, but his account of hope can apply to anyone. The hopeful of the world are those who see the Universe as being on their side. Set against “all experience, all probability, all statistics,” they see that a “given order shall be re-established.” Hope is not a wish. It is not optimism or naivety. It is an assertion. It is telling the world, “No, this is not the way things will be; things will be better.” For both Marcel and Tolkien, it is only with hope that we banish despair.

    You do not haggle with or beg the darkness. Like a blazing torch, you must shine hope brightly and fiercely.

    ***

But some people don't have the privilege of having hope. Some sit in the dark waiting for bombs and white phosphorus to fall on them and burn them, dying as the world watches a genocide unfold on live television. I pray for the people of Gaza, none of whom are ‘human animals’, for their existence.

    blogs.lse.ac.uk Long Read Review: Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman

    In Hitler’s American Model: The United States and the Making of Nazi Race Law, legal scholar James Q. Whitman examines how Nazi Germany looked to the model of the Jim Crow laws in the USA when form…

    cross-posted from: https://lemm.ee/post/10945207

    Long Read Review: Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman

*In Hitler’s American Model: The United States and the Making of Nazi Race Law, legal scholar James Q. Whitman examines how Nazi Germany looked to the model of the Jim Crow laws in the USA when formulating the Nuremberg Laws in the 1930s. This is a carefully researched and timely analysis of how racist ideology can penetrate the political and institutional fabric of societies, furthermore underscoring its continued impact in the USA today, writes Thomas Christie Williams.*

After the full horrors of Nazism were exposed at the end of World War II, eugenics – in Francis Galton’s words, the ‘science which deals with all influences that improve the inborn qualities of a race’ – as a social and scientific movement slowly faded from public view. The fact that Ronald Fisher, the founder of the modern discipline of genetics, and John Maynard Keynes, the economist whose ideas underpinned the New Deal, were active members of the Eugenics Society is now rarely discussed at Cambridge University, where they spent much of their academic careers. In 1954, the scientific journal the Annals of Eugenics was renamed the Annals of Human Genetics, and in 1965 the incoming recipient of the Chair of Eugenics at UCL, Harry Harris, became instead the Galton Professor of Human Genetics.

    However, two groups of people have worked hard to keep memories of this great enthusiasm for a ‘scientific’ approach to institutionalised racism alive. The first are those who see understanding the history of the twentieth century as important, in order that we do not make the same mistakes again. They argue that whilst Nazism was the extreme end of the spectrum, it espoused views on nationality and race that were, if not mainstream, definitely recognised as acceptable by many sectors of society in Europe and the Americas. James Q. Whitman, author of Hitler’s American Model: The United States and the Making of Nazi Race Law, falls into this camp.

A legal scholar, Whitman identifies many commonalities between Nazi legislation in the early 1930s, which sought to exclude Jews from German public life, and the ‘Jim Crow’ laws enacted to exclude African Americans in the United States. Moving beyond commonalities, he argues that Nazi lawyers and the German public had a keen interest in US race law. As an example, he cites a 1936 article on racial policy in Neues Volk (New People), a propaganda newsletter from the National Socialist Office, which included a US map labelled ‘Statutory Restrictions on Negro Rights’, detailing disenfranchisement and anti-miscegenation laws in the 48 mainland US states.

The second group comprises the far-right movements arguably edging into the mainstream in the United States and Europe (in Hungary or Holland, for example). The chants of ‘Blood and Soil’ from the recent white supremacist rallies in Charlottesville, Virginia were an explicit reference to the Nazi ideal of ‘Blut und Boden’, and those gathered there are united by their fascination with fascist ideology and rhetoric. Vanguard America argues in its manifesto for an economy ‘free from the influence of international corporations, led by a rootless group of international Jews, which place profit beyond the interests of our people’. Membership of the National Socialist Movement (described on their website as ‘America’s Premier White Civil Rights Organization’) is ‘open to non-Semitic heterosexuals of European descent’, and a popular blogger for the alt-right, Mike Peinovich, who spoke at Charlottesville, hosts a chatshow entitled ‘The Daily Shoah’.

    Hitler’s American Model is therefore a timely and sobering outline of how racist ideology can make its way into the political fabric of a country. It focuses on the changes introduced by Nazi lawyers post-1933, but we also learn much about how this developed in the United States. Whilst in the latter the case law excluding non-whites from public life developed over decades, in Nazi Germany the Nuremberg Laws were drafted and introduced in 1935, just two years after Hitler became Chancellor. Whitman’s main premise is that in this accelerated process, German lawyers and officials took inspiration and concrete guidance from legal practice across the Atlantic.

Reading the book, two sets of records stand out, one for their presence, and the other for their absence. The first is the stenographic report of a 5 June 1934 meeting of the Commission on Criminal Law Reform. Whitman’s twenty-page description of this transcript makes for gripping reading, and is the highlight of the book (94-113). The second is the lack of documentation regarding a September 1935 US study tour by 45 German lawyers (132). The trip was apparently a reward for their success in finalising the Nuremberg Race Laws, laid out by Hermann Göring at a rally only a few weeks earlier. As Dr. Heubner, chief of the Nazi Jurists’ Association, told the tour group before they left: ‘through this study trip the upholder of German law [will] gain the necessary compensation for an entire year of work’ (133). According to Whitman, the historical record tells us that on arrival in New York at a reception organised by the New York City Bar Association, the group were met by a noisy demonstration lasting six hours and requiring a police presence. However, in Whitman’s words: ‘sadly it does not seem possible to learn more about how […] the group fared on their study trip’. From the first set of records we learn much about how German lawyers saw their American counterparts; from the second (missing) set, we might have learnt more about how the American establishment viewed legal developments in the Third Reich.

    Assembled at the 1934 meeting were seventeen lawyers and officials, and their brief was to respond to the demands of the Prussian Memorandum of September 1933. This document argued that the ‘task of the National Socialist State is to check the race-mixing that has been underway in Germany over the course of the centuries, and strive towards the goal of guaranteeing that Nordic blood, which is still determinative in the German people, should put its distinctive stamp on our life again’ (85). The final outcome of such meetings was the Nuremberg Laws, which consisted of three parts. The first, the Flag Law for the Reich, declared the swastika to be the only German national flag. The second, the Citizenship Laws, created a difference between German nationals – ‘any person who belongs to the mutual protection association of the German Reich’ – and the citizen – ‘a national of German blood’ who was the ‘sole bearer of full political rights’ (29). The third, the Nuremberg Blood Laws, made a criminal offence of marriage or extramarital sex between ‘Jews and nationals of German blood’ (31).

    Whitman’s description of the 1934 meeting is gripping for a number of reasons. Firstly, it allows the opportunity to witness the mechanics of discrimination at work. We learn how a group of highly educated professionals – civil servants, legal academics, medical doctors – came together to formulate a set of profoundly exclusionary and undemocratic laws. The committee was faced with a number of questions. How could one define race in legal terms? Could it be possible to criminalise an act (in this case, sexual relations between a German and a Jew) to which two competent parties had consented? Secondly, as a non-American, it further underscores the deeply institutionalised discrimination within US law at this time, belying the idea that a supposedly independent judiciary can act to protect the rights of all citizens.

    In Whitman’s interpretation, two groups were pitted against each other at the 1934 meeting. The first were juristic moderates, who felt that a policy of criminalising German and Jewish sexual relations was not in keeping with the German legal tradition. German criminal law, they argued, was based on clear and unambiguous concepts (105). Race, and in particular Jewishness, was difficult to ‘scientifically’ define (105); judges could not be expected to convict on the basis of vague concepts. Their adversaries were Nazi radicals, who argued that a new Criminal Code should be drawn up using the ‘fundamental principles of National Socialism’ (96). According to Whitman, it was these radicals who championed American law, already touched on in the Prussian Memorandum.

    As it turns out, the American approach to defining race was not greatly troubled by the absence of a scientific conceptualisation. For the Nazi radicals, this was a heartening example. Roland Freisler, a State Secretary attached to the Ministry of Justice, pointed out: ‘How have they gone about doing this [defining race]? They have used different means. Several states have simply employed geographical concepts […] others have conflated matters, combining geographical origin with their conception of a particular circle of blood relatedness’ (107). Freisler continued:

    > they name the races in some more primitive way […] and therefore I am of the opinion that we can proceed with the same primitivity that is used by these American states (109).

    Contrary to established German tradition, Nazi radicals believed that judges should be given freedom to institute racist legislation, without the need to come up with a scientifically satisfactory definition of race.

    It is hard to argue with Whitman’s assertion that Nazi jurists and policymakers took a sustained interest in American race law, and that this helped shape the legal and political climate that led to the promulgation of the Nuremberg Laws. What Whitman moves on to in his conclusion is the extent to which the American legal and political system as a whole, beyond Jim Crow, was permeated with racism: laws related to race-based immigration, race-based citizenship and race-based anti-miscegenation. He makes the unsettling argument that America and Nazi Germany were united by a strong egalitarian, if not libertarian (in the Nazi case), ethos. This ethos, he argues, is that of all white men being equal, and thus it was not surprising that Nazism – in Whitman’s view an egalitarian social revolution for those self-defining as of German origin – turned to America for inspiration. As Whitman points out, white supremacy has a long history in the US, from 1691 when Virginia adopted the first anti-miscegenation statute, to 1790, when the First Congress opened naturalisation to ‘any alien, being a free white person’ (145), to the anti-immigration laws that followed the San Francisco Gold Rush and the segregation laws that followed the Civil War. In the wake of the Charlottesville protests, he would probably argue against Senator John McCain’s assertionthat ‘white supremacists and neo-Nazis are, by definition, opposed to American patriotism and the ideals that define us as a people and make our nation special’.

    Whitman also questions whether the US common law system really serves to protect the freedom of individuals against an over-reaching state. He points out that the Nazis, rather than taking over the pre-existing German civil law system, reformed it according to a common law model. Nazi officials were given discretion to act in what they believed to be the ‘spirit of Hitler’ (149), brushing aside the legal scientific tradition of the moderates of the 1934 meeting. He argues that when it came to race, American ‘legal science’ tended to yield to American politics and left much racist legislation untouched.

    So where does that leave the ‘science’ of eugenics, and the ‘legal science’ of the jurists working in a civil code system? Does a logically consistent approach of any kind protect individual liberties, or rather open up a way to discriminate based on supposedly objective measures? An important point, not explicitly made by Whitman but implicit throughout the book, is that the supposed objectivity of a scientific approach (whether in biology or the law) can easily be misused by those whose aims are clearly undemocratic and unegalitarian. On ‘The Daily Shoah’ and other racist websites, substantial discussion is devoted to ‘metrics’ related to, for example, race and IQ or sexual orientation and the chance of conviction for paedophile offences.

    The Charlottesville protests were sparked by the decision to remove a statue of Robert E. Lee, a Confederate General in the Civil War: proponents of the removal argued that it served as a monument to white supremacy. Conversely, in the United Kingdom, a similar controversy surrounding a petition to remove Cecil Rhodes’s statue in Oriel College Oxford failed to lead to its removal, and the Galton Institute in London (which acknowledges its founding as the Eugenics Education Society in 1907, but disassociates itself from any interest in the theory and practice of eugenics) continues to fund research and award essay prizes on genetics for A Level students. Clearly retaining the material legacy of historical figures runs the risk of allowing their glorification (as in Charlottesville), whitewashing or suggesting implicit sanction of their actions.

    However, in Whitman’s view, to try to forget or ignore these figures and their ongoing influence on society today is the more dangerous option. Hitler’s American Model is a thoughtful and carefully researched account of how the legal community in the US and Germany proved ‘incapable of staving off the dangers of the politicization of criminal law’ (159). He worries that:

    > the story in this book […] is not done yet […] what Roland Freisler saw, and admired, in American race law eighty years ago is still with us in the politics of American criminal justice (160).

    Given recent developments in American politics, this should perhaps give us all pause for thought.

    ***

    Subscribe to !history@lemm.ee :)

    20

    Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman | Long Read Review

    blogs.lse.ac.uk Long Read Review: Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman

*In Hitler’s American Model: The United States and the Making of Nazi Race Law, legal scholar James Q. Whitman examines how Nazi Germany looked to the model of the Jim Crow laws in the USA when formulating the Nuremberg Laws in the 1930s. This is a carefully researched and timely analysis of how racist ideology can penetrate the political and institutional fabric of societies, furthermore underscoring its continued impact in the USA today, writes Thomas Christie Williams.*

After the full horrors of Nazism were exposed at the end of World War II, eugenics – in Francis Galton’s words, the ‘science which deals with all influences that improve the inborn qualities of a race’ – as a social and scientific movement slowly faded from public view. The fact that Ronald Fisher, the founder of the modern discipline of genetics, and John Maynard Keynes, the economist whose ideas underpinned the New Deal, were active members of the Eugenics Society is now rarely discussed at Cambridge University, where they spent much of their academic careers. In 1954, the name of the scientific journal the Annals of Eugenics was changed to the Annals of Human Genetics, and in 1965 the incoming recipient of the Chair of Eugenics at UCL, Harry Harris, became instead the Galton Professor of Human Genetics.

    However, two groups of people have worked hard to keep memories of this great enthusiasm for a ‘scientific’ approach to institutionalised racism alive. The first are those who see understanding the history of the twentieth century as important, in order that we do not make the same mistakes again. They argue that whilst Nazism was the extreme end of the spectrum, it espoused views on nationality and race that were, if not mainstream, definitely recognised as acceptable by many sectors of society in Europe and the Americas. James Q. Whitman, author of Hitler’s American Model: The United States and the Making of Nazi Race Law, falls into this camp.

    A legal scholar, Whitman identifies many commonalities between Nazi legislation in the early 1930s, which sought to exclude Jews from German public life, and the ‘Jim Crow’ laws enacted to exclude African Americans in the United States. Moving beyond commonalities, he argues that Nazi lawyers and the German public had a keen interest in US race law. As an example, he cites a 1936 article on racial policy in Neues Volk (New Volk), a propaganda newsletter from the National Socialist Office, which included a US map labelled ‘Statutory Restrictions on Negro Rights’, detailing disenfranchisement and anti-miscegenation laws in the 48 mainland US states.

The second group consists of the far-right movements arguably edging into the mainstream in the United States and Europe (in Hungary or Holland, for example). The chants of ‘Blood and Soil’ from the recent white supremacist rallies in Charlottesville, Virginia were an explicit reference to the Nazi ideal of ‘Blut und Boden’, and those gathered there are united by their fascination with fascist ideology and rhetoric. Vanguard America argues in its manifesto for an economy ‘free from the influence of international corporations, led by a rootless group of international Jews, which place profit beyond the interests of our people’. Membership of the National Socialist Movement (described on its website as ‘America’s Premier White Civil Rights Organization’) is ‘open to non-Semitic heterosexuals of European descent’, and a popular blogger for the alt-right, Mike Peinovich, who spoke at Charlottesville, hosts a chatshow entitled ‘The Daily Shoah’.

Hitler’s American Model is therefore a timely and sobering outline of how racist ideology can make its way into the political fabric of a country. It focuses on the changes introduced by Nazi lawyers post-1933, but we also learn much about how race law developed in the United States. Whilst in the latter the case law excluding non-whites from public life developed over decades, in Nazi Germany the Nuremberg Laws were drafted and introduced in 1935, just two years after Hitler became Chancellor. Whitman’s main premise is that in this accelerated process, German lawyers and officials took inspiration and concrete guidance from legal practice across the Atlantic.

Reading the book, two sets of records stand out, one for their presence, and the other for their absence. The first is the stenographic report of a 5 June 1934 meeting of the Commission on Criminal Law Reform. Whitman’s twenty-page description of this transcript makes for gripping reading, and is the highlight of the book (94-113). The second is the lack of documentation regarding a September 1935 US study tour by 45 German lawyers (132). The trip was apparently a reward for their success in finalising the Nuremberg Race Laws, laid out by Hermann Göring at a rally only a few weeks earlier. As Dr. Heubner, chief of the Nazi Jurists’ Association, told the tour group before they left: ‘through this study trip the upholder of German law [will] gain the necessary compensation for an entire year of work’ (133). According to Whitman, the historical record tells us that on arrival in New York at a reception organised by the New York City Bar Association, the group were met by a noisy demonstration lasting six hours and requiring a police presence. However, in Whitman’s words: ‘sadly it does not seem possible to learn more about how […] the group fared on their study trip’. From the first set of records we learn much about how German lawyers saw their American counterparts; from the second (missing) set, we might have learnt more about how the American establishment viewed legal developments in the Third Reich.

    Assembled at the 1934 meeting were seventeen lawyers and officials, and their brief was to respond to the demands of the Prussian Memorandum of September 1933. This document argued that the ‘task of the National Socialist State is to check the race-mixing that has been underway in Germany over the course of the centuries, and strive towards the goal of guaranteeing that Nordic blood, which is still determinative in the German people, should put its distinctive stamp on our life again’ (85). The final outcome of such meetings was the Nuremberg Laws, which consisted of three parts. The first, the Flag Law for the Reich, declared the swastika to be the only German national flag. The second, the Citizenship Laws, created a difference between German nationals – ‘any person who belongs to the mutual protection association of the German Reich’ – and the citizen – ‘a national of German blood’ who was the ‘sole bearer of full political rights’ (29). The third, the Nuremberg Blood Laws, made a criminal offence of marriage or extramarital sex between ‘Jews and nationals of German blood’ (31).

Whitman’s description of the 1934 meeting is gripping for a number of reasons. Firstly, it allows the opportunity to witness the mechanics of discrimination at work. We learn how a group of highly educated professionals – civil servants, legal academics, medical doctors – came together to formulate a set of profoundly exclusionary and undemocratic laws. The committee was faced with a number of questions. How could one define race in legal terms? Could it be possible to criminalise an act (in this case, sexual relations between a German and a Jew) to which two competent parties had consented? Secondly, to a non-American reader, it further underscores the deeply institutionalised discrimination within US law at this time, belying the idea that a supposedly independent judiciary can act to protect the rights of all citizens.

    In Whitman’s interpretation, two groups were pitted against each other at the 1934 meeting. The first were juristic moderates, who felt that a policy of criminalising German and Jewish sexual relations was not in keeping with the German legal tradition. German criminal law, they argued, was based on clear and unambiguous concepts (105). Race, and in particular Jewishness, was difficult to ‘scientifically’ define (105); judges could not be expected to convict on the basis of vague concepts. Their adversaries were Nazi radicals, who argued that a new Criminal Code should be drawn up using the ‘fundamental principles of National Socialism’ (96). According to Whitman, it was these radicals who championed American law, already touched on in the Prussian Memorandum.

    As it turns out, the American approach to defining race was not greatly troubled by the absence of a scientific conceptualisation. For the Nazi radicals, this was a heartening example. Roland Freisler, a State Secretary attached to the Ministry of Justice, pointed out: ‘How have they gone about doing this [defining race]? They have used different means. Several states have simply employed geographical concepts […] others have conflated matters, combining geographical origin with their conception of a particular circle of blood relatedness’ (107). Freisler continued:

    > they name the races in some more primitive way […] and therefore I am of the opinion that we can proceed with the same primitivity that is used by these American states (109).

    Contrary to established German tradition, Nazi radicals believed that judges should be given freedom to institute racist legislation, without the need to come up with a scientifically satisfactory definition of race.

It is hard to argue with Whitman’s assertion that Nazi jurists and policymakers took a sustained interest in American race law, and that this helped shape the legal and political climate that led to the promulgation of the Nuremberg Laws. What Whitman moves on to in his conclusion is the extent to which the American legal and political system as a whole, beyond Jim Crow, was permeated with racism: laws related to race-based immigration, race-based citizenship and race-based anti-miscegenation. He makes the unsettling argument that America and Nazi Germany were united by a strong egalitarian, if not libertarian (in the Nazi case), ethos. This ethos, he argues, is that of all white men being equal, and thus it was not surprising that Nazism – in Whitman’s view an egalitarian social revolution for those self-defining as of German origin – turned to America for inspiration. As Whitman points out, white supremacy has a long history in the US, from 1691 when Virginia adopted the first anti-miscegenation statute, to 1790, when the First Congress opened naturalisation to ‘any alien, being a free white person’ (145), to the anti-immigration laws that followed the San Francisco Gold Rush and the segregation laws that followed the Civil War. In the wake of the Charlottesville protests, he would probably argue against Senator John McCain’s assertion that ‘white supremacists and neo-Nazis are, by definition, opposed to American patriotism and the ideals that define us as a people and make our nation special’.

    Whitman also questions whether the US common law system really serves to protect the freedom of individuals against an over-reaching state. He points out that the Nazis, rather than taking over the pre-existing German civil law system, reformed it according to a common law model. Nazi officials were given discretion to act in what they believed to be the ‘spirit of Hitler’ (149), brushing aside the legal scientific tradition of the moderates of the 1934 meeting. He argues that when it came to race, American ‘legal science’ tended to yield to American politics and left much racist legislation untouched.

    So where does that leave the ‘science’ of eugenics, and the ‘legal science’ of the jurists working in a civil code system? Does a logically consistent approach of any kind protect individual liberties, or rather open up a way to discriminate based on supposedly objective measures? An important point, not explicitly made by Whitman but implicit throughout the book, is that the supposed objectivity of a scientific approach (whether in biology or the law) can easily be misused by those whose aims are clearly undemocratic and unegalitarian. On ‘The Daily Shoah’ and other racist websites, substantial discussion is devoted to ‘metrics’ related to, for example, race and IQ or sexual orientation and the chance of conviction for paedophile offences.

The Charlottesville protests were sparked by the decision to remove a statue of Robert E. Lee, a Confederate General in the Civil War: proponents of the removal argued that it served as a monument to white supremacy. Conversely, in the United Kingdom, a similar controversy surrounding a petition to remove Cecil Rhodes’s statue in Oriel College, Oxford failed to lead to its removal, and the Galton Institute in London (which acknowledges its founding as the Eugenics Education Society in 1907, but disassociates itself from any interest in the theory and practice of eugenics) continues to fund research and award essay prizes on genetics for A Level students. Clearly, retaining the material legacy of historical figures runs the risk of glorifying them (as in Charlottesville), whitewashing their actions or suggesting implicit sanction of them.

    However, in Whitman’s view, to try to forget or ignore these figures and their ongoing influence on society today is the more dangerous option. Hitler’s American Model is a thoughtful and carefully researched account of how the legal community in the US and Germany proved ‘incapable of staving off the dangers of the politicization of criminal law’ (159). He worries that:

    > the story in this book […] is not done yet […] what Roland Freisler saw, and admired, in American race law eighty years ago is still with us in the politics of American criminal justice (160).

    Given recent developments in American politics, this should perhaps give us all pause for thought.

    gamerant.com Lord of the Rings: Gollum Apology Was Written Using ChatGPT, According to Report

    A new report claims the apology posted for Lord of the Rings: Gollum was written using ChatGPT without the dev team's consent.


    • The Lord of the Rings: Gollum game received a negative response due to technical problems, derivative gameplay, uninteresting narrative, and poor graphics.
    • The apology posted by Nacon was generated by an AI-powered text generator, ChatGPT, without the knowledge of the developer, Daedalic Entertainment.
    • The game's troubled development was attributed to a lack of funds and time, leading to downscaled features and a rushed release.

    ***

According to a recent report, Daedalic Entertainment, the developer behind the infamous The Lord of the Rings: Gollum game, has claimed that the game's public apology was generated by the AI-powered text generator ChatGPT. The report also states that the developers had no knowledge of the apology being written, and that it was a decision made by the game's publisher, Nacon. Alongside that, Daedalic Entertainment employees went into detail about what had gone wrong with Lord of the Rings: Gollum during development.

Earlier this year, the licensed game launched to an overwhelmingly negative response, and Lord of the Rings: Gollum became one of the lowest-rated games of 2023. Several critics and fans cited the game's technical problems as the biggest factor in this reception, compounded by gameplay that many found derivative and uninteresting. The graphics weren't a big selling point either, nor was the narrative compelling to many. The game failed to deliver on all fronts, with players walking away unimpressed by the big licensed release. The developers have recently spoken up about its troubled development and the source of the struggles the team faced.

Anonymous employees from Daedalic Entertainment were interviewed by the German gaming outlet GameTwo, with some discussing the relationship between the developer and its publisher, Nacon. One thing that was brought up was the apology regarding Gollum's troubled launch that was posted to the game's official Twitter account. According to Daedalic, this apology was written using ChatGPT, and the developer had no knowledge of it being written, or of its content, prior to publication, claiming it was all handled by Nacon.

    > My favorit part. This nonpology from Nacon was written with ChatGPT. pic.twitter.com/N0ZtX2I6WZ — Knoebel (@Knoebelbroet) October 7, 2023

Regarding the apology made for The Lord of the Rings: Gollum, many gamers said the revelation that Nacon had used ChatGPT to generate it explained why it had seemed noncommittal and disingenuous. In hindsight, the biggest indicator that the apology was published without oversight was the misspelling of the game's title, addressing it as "The Lord of the Ring: Gollum" in the post.

The average development budget of a AAA game in 2023 is usually around $50–300 million, while Gollum's budget was a more modest 15 million euros. According to a former senior designer, this lack of funds and time was a big contributing factor in the state the game was released in. Many features had to be downscaled as a result: one scene, for example, had to be restricted to Gollum eavesdropping on two major characters because there was no time to animate them. With this report, it's possible that more blame will fall on the game's publisher, Nacon, rather than the developers.

    The Lord of the Rings: Gollum is available now for PC, PS4, PS5, Xbox One, and Xbox Series X/S, with a Switch version to come at a later date.

    Links:

    https://www.videogameschronicle.com/news/chatgpt-was-used-to-write-gollum-game-apology-its-claimed/

    https://www.dexerto.com/gaming/lord-of-the-rings-gollum-apology-reportedly-written-by-chat-gpt-2327768/

    www.independent.co.uk How Moomin creator Tove Jansson found her dark side illustrating Tolkien and Carroll


    She thought Lewis Carroll was ‘pathological’. Her Gollum was so monstrous that JRR Tolkien amended his book’s text – but copies of Tove Jansson’s illustrated edition of ‘The Hobbit’ fly off shelves even though they remain in the original Finnish. Susie Mesure visits a new exhibition that shows how the brain behind the Moomins turned her vividly macabre eye to transform other classic books

    In November 1960, Astrid Lindgren got in touch with Tove Jansson, the creator of the Moomins. Lindgren, who wrote the Swedish children’s classic Pippi Longstocking, was also a publisher – and she begged Jansson to turn her imagination to the works of JRR Tolkien. “Who will comfort Astrid if you don’t agree to the proposal I’m now going to make to you?” Lindgren wrote in a letter, riffing on the title of another of Jansson’s recent picture books, Who Will Comfort Toffle?

    In the UK, as in her native Finland, Tove Jansson is, of course, best known for the Moomins, an adventurous family of fantastical creatures who live in a magical valley on the edge of a Finnish archipelago. A Moomin comic strip ran in London’s Evening News from 1954 until 1975, reaching millions of readers across the Commonwealth, and, more recently, the first new animation series about the Moomins for nearly three decades – Moominvalley – brought the white trolls to life for a new, younger generation. Fans range from devotees who grew up on books such as Finn Family Moomintroll or Comet in Moominland, to those with a penchant for collecting Moomin mugs, which Moomin Characters, the family-owned company that looks after Jansson’s legacy, still churns out year after year.

    Less is known, however, about the success Jansson, a Finnish icon who died in 2001 aged 86, had illustrating the work of other writers, something a new exhibition in Paris is putting under the spotlight. Houses of Tove explores how Jansson was so much more than a comic strip creator, a job she came to loathe because it kept her from her true passions: painting and writing. The show includes a first edition of The Hobbit, or Bilbo – en hobbits äventyr, as it is known in Swedish, which Jansson jumped at the chance to illustrate for Lindgren. It also features a number of preparatory sketches she made for the commission, which were used in a 1973 Finnish translation: the first edition, featuring a wonderful red dragon hovering above a tiny army scaling jagged peaks, is on display.

    www.bbc.com When 'fairy mania' gripped Britain

    "Fairycore" may be trending on social media today but 100 years ago supernatural sprites were a national obsession. Holly Williams explores fairy fever.


    Imagine a fairy. Is the picture that appears in your mind's eye a tiny, pretty, magical figure – a childish wisp with insect-like wings and a dress made of petals?

    If so, it's likely you've been influenced by Cicely Mary Barker, the British illustrator who created the Flower Fairies. 2023 marks 100 years since the publication of her first book of poems and pictures, Flower Fairies of the Spring – an anniversary currently being celebrated in an exhibition at the Lady Lever Gallery in Merseyside, UK.

    The Flower Fairies' influence has endured: they have never been out of print, and continue to be popular around the world – big in Japan and in Italy, where Gucci released a children's range featuring Barker's prints in 2022. Billie Eilish recently had Flower Fairies tattooed on her hand, while their whimsical, floral aesthetic can be seen in the TikTok "fairycore" trend.


    (An exhibition at the Lady Lever Art Gallery explores the Flower Fairies phenomenon, and features pantomime costumes (Credit: Pete Carr))

Barker's delicate watercolours certainly helped cement several tropes we now consider classic – almost essential, in fact – in the iconography of the fairy: they are miniature, sweet and youthful, they are intertwined with plants and the natural world, and they are distinctly twee. Yet her drawings were also "firmly footed in realism", points out Fiona Slattery Clark, curator of the show. "The children were all painted from life [and] her plants and flowers are as realistic as possible." Barker drew children from the nursery school her sister ran in their house in Croydon, near London; each was assigned a flower or tree, and Barker's detailed illustrations were botanically accurate – she would source samples from Kew Gardens, says Slattery Clark. Even the petal-like wings and fairy outfits were closely based on plants: an acorn cup becoming a jaunty cap, a harebell becoming a prettily scalloped skirt.

    > For many hundreds of years, fairies were not necessarily tiny and fey, but grotesque or fierce elemental forces

The Flower Fairies were an immediate hit – but Barker was far from the only artist of her era to find success with fairies. In fact, fairy fever swelled within the United Kingdom for over half a century, reaching something of a peak around the time the Flower Fairies emerged in 1923. Over 350 fairy books were published in the UK between 1920 and 1925, including Enid Blyton's first fairy foray, a collection of poems called Real Fairies in 1923. Fairy art even had the stamp of royal approval: Queen Mary was a fan of Ida Rentoul Outhwaite's ethereal drawings, and helped popularise them by sending them in postcard form.

    Fairies have long been with us – in our imaginations, at least. But for many hundreds of years, they were not necessarily tiny and fey, but grotesque or fierce elemental forces, capable of great darkness. "In 1800, if you thought your child was a fairy it would have been like demonic possession – you would have put that child in the fire to drive out the fairy," points out Alice Sage, a curator and historian.


    (Each of Barker's fairies corresponded to a plant, tree or flower – pictured, the Silver Birch Fairy (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

Yet within 100 years, the whole conception of fairies completely changed. "Throughout the 19th Century, fairies became increasingly miniaturised, sapped of their power – trapped in the nursery," says Sage. As the Victorian era progressed, they became increasingly associated with childhood; as their popularity grew, they shrank.

    But first, fairies became a fashionable subject for Victorian artists, often taking inspiration from Shakespeare's A Midsummer Night's Dream and The Tempest. John Anster Fitzgerald, Edwin Landseer, John Everett Millais, Joseph Noel Paton, Arthur Rackham and even JMW Turner – among many others – painted supernatural sprites from the 1840s onwards. But there was still a sense of otherworldly strangeness in many of their depictions – as seen in the work of Richard Dadd, who made his hyper-intricate fairy paintings while living in a Victorian asylum after killing his father.

    Then two wider cultural developments came along that changed fairy reputations forever. One was that "children's literature happened", says Sage. The Victorians promoted the idea of childhood as a time of innocence, requiring its own entertainment. Illustrated children's books really took off from the 1870s, with fairies a staple, and increasingly cutesy, feature. The second was pantomime. "Every Victorian pantomime would have this big spectacle of transformation at the end, where children dressed as fairies filled the stage," says Sage. The standard fairy fancy dress outfit today is basically the same as what these Victorian children would have worn: think tinsel, sparkly sequins, and translucent, gauzy wings.

    Huge popularity

    Moving into the 20th Century, fairies showed few signs of buzzing off – if anything, they cemented their place. "In the Edwardian era, Peter Pan started to be performed [in 1904], and that carried on for the next 25 years," points out Slattery Clark – enough time for several generations of children to learn to clap their hands to show they believe in fairies.

    And as the new century lurched through global upheaval via World War One, fairy mania continued – if anything, widening and deepening. "That golden age of children's literature is really an upper middle-class phenomenon," points out Sage. "What happened from World War One onwards is it explodes beyond that, and becomes an adult concern."

    !image

    (The costumes displayed in the exhibition are based on the Flower Fairies illustrations (Credit: Pete Carr))

    Having been whisked from the woods into the nursery, fairies then made their way to troubled adults on the battlefield or waiting at home. Consider the huge popularity of a print, Piper of Dreams by Estella Canziani, during World War One: a wispy image of a man playing a pipe and surrounded by tiny fairies, it sold a staggering quarter of a million copies in just 1916.

    "It's about belief and it's about hope – that's what fairies represent in that time," says Sage. "The supernatural becomes a way of finding some luck and brightness, [when] people don't have control over their lives, their future, their families."

    > For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – Alice Sage

    Today, we associate fairies with little girls – but this was an era when fairy art was popular with grown men, too. And technology helped spread it: there was an explosion in sending postcards around this time. They were cheap to buy, and free to post to a serving soldier abroad. "Suddenly everyone can send fairies, and they're flying through the air and across the seas. You can’t underestimate the practical aspect of it," says Sage.

    Indeed, Barker herself cut her teeth illustrating such postcards: she produced a patriotic series showing "Children of the Allies", in different forms of national dress, in 1915, followed by a series of characters from Shakespeare, before teasing the Flower Fairies with a set of "Fairies and Elves" postcards in 1918.

    Barker never made any claims for fairies being real – "I have never seen a fairy", she wrote in a foreword to Flower Fairies of the Wayside. But it is worth noting that she first published the Flower Fairies at a moment when the desire to believe in magical beings was at a rare high. In 1920, Britain was gripped by the story of the Cottingley Fairies, after two girls claimed to have photographed fairies at the bottom of their garden in West Yorkshire – and were widely believed.

    Their beautiful photographs were created by paper cut-outs, floating on hat pins. Although many were sceptical, they nonetheless also fooled many of the great and the good – the photographs were brought to prominence by no less than Sir Arthur Conan Doyle, the author of Sherlock Holmes, who wrote a whole book about it, The Coming of the Fairies, in 1922.

    !image

    (The Crocus Fairies from Flower Fairies of the Spring – the watercolours are still popular today with “fairycore” fans (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

    Cousins Elsie Wright and Frances Griffiths were aged 16 and nine when they took the first photos. Many years later, in the 1980s, they admitted it was a hoax, explaining that they kept up the pretence that the fairies were real a because they felt sorry for the middle-aged men, like Conan Doyle, that so wanted to believe. There was, at the time, a serious resurgence in spiritualism in the UK, with seances and attempts to contact the dead proving understandably tempting for the bereaved. Conan Doyle himself became interested in a spirit world after his son died in the war. And for believers, this wasn't "woo-woo" nonsense – it was supposedly based in science. After all, scientific advances were genuinely explaining hitherto unknown and invisible aspects of our world.

    "For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – this fascinating world on the edge of the limits of human perception," says Sage. "And obviously that's connected to the loss of his son in World War One."

    Like the Flower Fairies, the Cottingley photographs further reinforced the association between children and fairies, as well as cementing what a fairy looked like in the public consciousness. Yet aside from Tinkerbell, Flower Fairies are probably the only image from the fairy-fever era still instantly recognisable today. Why, of all the fairy content out there, have Barker's images endured so strongly over the past 100 years?

    "They were [originally published] in full colour, and a lot of books were published in black and white," begins Sage. What looked novel at the time, now seems charmingly period – but the delicacy, intricacy, and imagination of Barker's pictures can still cast a spell. "It's like dolls houses – things that are very miniaturised, but very detailed and realistic, scratch a certain itch," suggests Sage. "They are absolutely beautiful, which helps."

    "It's a real celebration of nature – there is a strong educational aspect to her work," puts forward Slattery Clark, emphasising the botanical accuracy of Barker’s drawings. The educational argument might sound absurd given we're discussing fairy art, but as a child who was obsessed with Flower Fairies, I can attest to the truth of it: all the wildflowers I know the names of I learned from these books.

    !image

    (Cicely Mary Barker's exquisite illustrations were hugely popular in the 1920s (Credit: Estate of Cicely Mary Barker))

    Having each fairy very specifically related to a particular plant was also commercially canny – whether Barker intended this or not, it created space for identification, for collectability, for a kind of innate brand franchising. "In children's culture, we create series that are collectable, that you identify with… It's like Pokemon or something!" laughs Sage. "When I speak to people about the Flower Fairies, especially groups of sisters, it's always 'which one were you?'"

    Still, Sage is pleased to see the Flower Fairies exhibited in a fine art context at the Lady Lever gallery. For a long time, men painting fairies has been considered art – but when women do it, it's just silly flowery stuff for children.

    "This is fine art – it's mass, popular fine art," insists Sage. "I think a lot of the diminishment of fairies and children's illustration is from a misogynist, snobbish and elitist art historical tradition. I'm so excited to see this kind of exhibition, that reclaims this history." Consider this a beating of wings, then, that takes fairies back out of the nursery – and into the gallery.

    Flower Fairies is at the Lady Lever Art Gallery, Port Sunlight Village, UK until 5 November.

    Holly Williams' novel What Time is Love? is out in paperback now.___

    ***

    Join !fantasy@lemm.ml

    0
    www.bbc.com When Britain was gripped by 'fairy mania'

    "Fairycore" may be trending on social media today but 100 years ago supernatural sprites were a national obsession. Holly Williams explores fairy fever.

    Imagine a fairy. Is the picture that appears in your mind's eye a tiny, pretty, magical figure – a childish wisp with insect-like wings and a dress made of petals?

    If so, it's likely you've been influenced by Cicely Mary Barker, the British illustrator who created the Flower Fairies. 2023 marks 100 years since the publication of her first book of poems and pictures, Flower Fairies of the Spring – an anniversary currently being celebrated in an exhibition at the Lady Lever Gallery in Merseyside, UK.

    The Flower Fairies' influence has endured: they have never been out of print, and continue to be popular around the world – big in Japan and in Italy, where Gucci released a children's range featuring Barker's prints in 2022. Billie Eilish recently had Flower Fairies tattooed on her hand, while their whimsical, floral aesthetic can be seen in the TikTok "fairycore" trend.

    !image

    (An exhibition at the Lady Lever Art Gallery explores the Flower Fairies phenomenon, and features pantomime costumes (Credit: Pete Carr))

    Barker's delicate watercolours certainly helped cement several tropes we now consider classic – almost essential, in fact – in the iconography of the fairy: they are miniature, sweet and youthful, they are intertwined with plants and the natural world, and they are distinctly twee. Yet her drawings were also "firmly footed in realism", points out Fiona Slattery Clark, curator of the show. "The children were all painted from life [and] her plants and flowers are as realistic as possible." Barker drew children from the nursery school her sister ran in their house in Croydon near London; each was assigned a flower or tree, and Barker's detailed illustrations were botanically accurate – she would source samples from Kew Gardens, says Slattery Clark. Even the petal-like wings and fairy outfits were closely based on plants: an acorn cup becoming a jaunty cap, a harebell becoming a prettily scalloped skirt.

    > For many hundreds of years, fairies were not necessarily tiny and fey, but grotesque or fierce elemental forces

    The Flower Fairies were an immediate hit – but Barker was far from the only artist of her era to find success with fairies. In fact, fairy fever swelled within the United Kingdom for over half a century, reaching something of a peak around the time the Flower Fairies emerged in 1923. Over 350 fairy books were published in the UK between 1920 and 1925, including Enid Blyton's first fairy foray, a collection of poems called Real Fairies, in 1923. Fairy art even had the stamp of royal approval: Queen Mary was a fan of Ida Rentoul Outhwaite's ethereal drawings, and helped popularise them by sending them in postcard form.

    Fairies have long been with us – in our imaginations, at least. But for many hundreds of years, they were not necessarily tiny and fey, but grotesque or fierce elemental forces, capable of great darkness. "In 1800, if you thought your child was a fairy it would have been like demonic possession – you would have put that child in the fire to drive out the fairy," points out Alice Sage, a curator and historian.

    !image

    (Each of Barker's fairies corresponded to a plant, tree or flower – pictured, the Silver Birch Fairy (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

    Yet within 100 years, the whole conception of fairies completely changed. "Throughout the 19th Century, fairies became increasingly miniaturised, sapped of their power – trapped in the nursery," says Sage. As the Victorian era progressed, they became increasingly associated with childhood; as their popularity grew, they shrank.

    But first, fairies became a fashionable subject for Victorian artists, often taking inspiration from Shakespeare's A Midsummer Night's Dream and The Tempest. John Anster Fitzgerald, Edwin Landseer, John Everett Millais, Joseph Noel Paton, Arthur Rackham and even JMW Turner – among many others – painted supernatural sprites from the 1840s onwards. But there was still a sense of otherworldly strangeness in many of their depictions – as seen in the work of Richard Dadd, who made his hyper-intricate fairy paintings while living in a Victorian asylum after killing his father.

    Then two wider cultural developments came along that changed fairy reputations forever. One was that "children's literature happened", says Sage. The Victorians promoted the idea of childhood as a time of innocence, requiring its own entertainment. Illustrated children's books really took off from the 1870s, with fairies a staple, and increasingly cutesy, feature. The second was pantomime. "Every Victorian pantomime would have this big spectacle of transformation at the end, where children dressed as fairies filled the stage," says Sage. The standard fairy fancy dress outfit today is basically the same as what these Victorian children would have worn: think tinsel, sparkly sequins, and translucent, gauzy wings.

    Huge popularity

    Moving into the 20th Century, fairies showed few signs of buzzing off – if anything, they cemented their place. "In the Edwardian era, Peter Pan started to be performed [in 1904], and that carried on for the next 25 years," points out Slattery Clark – enough time for several generations of children to learn to clap their hands to show they believe in fairies.

    And as the new century lurched through global upheaval via World War One, fairy mania continued – if anything, widening and deepening. "That golden age of children's literature is really an upper middle-class phenomenon," points out Sage. "What happened from World War One onwards is it explodes beyond that, and becomes an adult concern."

    !image

    (The costumes displayed in the exhibition are based on the Flower Fairies illustrations (Credit: Pete Carr))

    Having been whisked from the woods into the nursery, fairies then made their way to troubled adults on the battlefield or waiting at home. Consider the huge popularity during World War One of Estella Canziani's print Piper of Dreams: a wispy image of a man playing a pipe, surrounded by tiny fairies, it sold a staggering quarter of a million copies in 1916 alone.

    "It's about belief and it's about hope – that's what fairies represent in that time," says Sage. "The supernatural becomes a way of finding some luck and brightness, [when] people don't have control over their lives, their future, their families."

    > For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – Alice Sage

    Today, we associate fairies with little girls – but this was an era when fairy art was popular with grown men, too. And technology helped spread it: there was an explosion in sending postcards around this time. They were cheap to buy, and free to post to a serving soldier abroad. "Suddenly everyone can send fairies, and they're flying through the air and across the seas. You can’t underestimate the practical aspect of it," says Sage.

    Indeed, Barker herself cut her teeth illustrating such postcards: she produced a patriotic series showing "Children of the Allies", in different forms of national dress, in 1915, followed by a series of characters from Shakespeare, before teasing the Flower Fairies with a set of "Fairies and Elves" postcards in 1918.

    Barker never made any claims for fairies being real – "I have never seen a fairy", she wrote in a foreword to Flower Fairies of the Wayside. But it is worth noting that she first published the Flower Fairies at a moment when the desire to believe in magical beings was at a rare high. In 1920, Britain was gripped by the story of the Cottingley Fairies, after two girls claimed to have photographed fairies at the bottom of their garden in West Yorkshire – and were widely believed.

    Their beautiful photographs were created by paper cut-outs, floating on hat pins. Although many were sceptical, they nonetheless also fooled many of the great and the good – the photographs were brought to prominence by no less than Sir Arthur Conan Doyle, the author of Sherlock Holmes, who wrote a whole book about it, The Coming of the Fairies, in 1922.

    !image

    (The Crocus Fairies from Flower Fairies of the Spring – the watercolours are still popular today with “fairycore” fans (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

    Cousins Elsie Wright and Frances Griffiths were aged 16 and nine when they took the first photos. Many years later, in the 1980s, they admitted it was a hoax, explaining that they kept up the pretence that the fairies were real because they felt sorry for the middle-aged men, like Conan Doyle, who so wanted to believe. There was, at the time, a serious resurgence in spiritualism in the UK, with seances and attempts to contact the dead proving understandably tempting for the bereaved. Conan Doyle himself became interested in the spirit world after his son died in the war. And for believers, this wasn't "woo-woo" nonsense – it was supposedly based in science. After all, scientific advances were genuinely explaining hitherto unknown and invisible aspects of our world.

    "For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – this fascinating world on the edge of the limits of human perception," says Sage. "And obviously that's connected to the loss of his son in World War One."

    Like the Flower Fairies, the Cottingley photographs further reinforced the association between children and fairies, as well as cementing what a fairy looked like in the public consciousness. Yet aside from Tinkerbell, Flower Fairies are probably the only image from the fairy-fever era still instantly recognisable today. Why, of all the fairy content out there, have Barker's images endured so strongly over the past 100 years?

    "They were [originally published] in full colour, and a lot of books were published in black and white," begins Sage. What looked novel at the time now seems charmingly period – but the delicacy, intricacy, and imagination of Barker's pictures can still cast a spell. "It's like dolls' houses – things that are very miniaturised, but very detailed and realistic, scratch a certain itch," suggests Sage. "They are absolutely beautiful, which helps."

    "It's a real celebration of nature – there is a strong educational aspect to her work," puts forward Slattery Clark, emphasising the botanical accuracy of Barker’s drawings. The educational argument might sound absurd given we're discussing fairy art, but as a child who was obsessed with Flower Fairies, I can attest to the truth of it: all the wildflowers I know the names of I learned from these books.

    !image

    (Cicely Mary Barker's exquisite illustrations were hugely popular in the 1920s (Credit: Estate of Cicely Mary Barker))

    Having each fairy very specifically related to a particular plant was also commercially canny – whether Barker intended this or not, it created space for identification, for collectability, for a kind of innate brand franchising. "In children's culture, we create series that are collectable, that you identify with… It's like Pokemon or something!" laughs Sage. "When I speak to people about the Flower Fairies, especially groups of sisters, it's always 'which one were you?'"

    Still, Sage is pleased to see the Flower Fairies exhibited in a fine art context at the Lady Lever gallery. For a long time, men painting fairies has been considered art – but when women do it, it's just silly flowery stuff for children.

    "This is fine art – it's mass, popular fine art," insists Sage. "I think a lot of the diminishment of fairies and children's illustration is from a misogynist, snobbish and elitist art historical tradition. I'm so excited to see this kind of exhibition, that reclaims this history." Consider this a beating of wings, then, that takes fairies back out of the nursery – and into the gallery.

    Flower Fairies is at the Lady Lever Art Gallery, Port Sunlight Village, UK until 5 November.

    Holly Williams' novel What Time is Love? is out in paperback now.

    aeon.co How the fall of the Roman empire paved the road to modernity | Aeon Essays

    The fall of the Roman Empire wasn’t a tragedy for civilisation. It was a lucky break for humanity as a whole

    cross-posted from: https://lemm.ee/post/10358195

    The road from Rome

    For an empire that collapsed more than 1,500 years ago, ancient Rome maintains a powerful presence. About 1 billion people speak languages derived from Latin; Roman law shapes modern norms; and Roman architecture has been widely imitated. Christianity, which the empire embraced in its sunset years, remains the world’s largest religion. Yet all these enduring influences pale against Rome’s most important legacy: its fall. Had its empire not unravelled, or had it been replaced by a similarly overpowering successor, the world wouldn’t have become modern.

    This isn’t the way that we ordinarily think about an event that has been lamented pretty much ever since it happened. In the late 18th century, in his monumental work The History of the Decline and Fall of the Roman Empire (1776-1788), the British historian Edward Gibbon called it ‘the greatest, perhaps, and most awful scene in the history of mankind’. Tankloads of ink have been expended on explaining it. Back in 1984, the German historian Alexander Demandt patiently compiled no fewer than 210 different reasons for Rome’s demise that had been put forward over time. And the flood of books and papers shows no sign of abating: most recently, disease and climate change have been pressed into service. Wouldn’t only a calamity of the first order warrant this kind of attention?

    It’s true that Rome’s collapse reverberated widely, at least in the western – mostly European – half of its empire. (A shrinking portion of the eastern half, later known as Byzantium, survived for another millennium.) Although some regions were harder hit than others, none escaped unscathed. Monumental structures fell into disrepair; previously thriving cities emptied out; Rome itself turned into a shadow of its former grand self, with shepherds tending their flocks among the ruins. Trade and coin use thinned out, and the art of writing retreated. Population numbers plummeted.

    But a few benefits were already being felt at the time. Roman power had fostered immense inequality: its collapse brought down the plutocratic ruling class, releasing the labouring masses from oppressive exploitation. The new Germanic rulers operated with lower overheads and proved less adept at collecting rents and taxes. Forensic archaeology reveals that people grew to be taller, likely thanks to reduced inequality, a better diet and lower disease loads. Yet these changes didn’t last.

    The real payoff of Rome’s demise took much longer to emerge. When Goths, Vandals, Franks, Lombards and Anglo-Saxons carved up the empire, they broke the imperial order so thoroughly that it never returned. Their 5th-century takeover was only the beginning: in a very real sense, Rome’s decline continued well after its fall – turning Gibbon’s title on its head. When the Germans took charge, they initially relied on Roman institutions of governance to run their new kingdoms. But they did a poor job of maintaining that vital infrastructure. Before long, nobles and warriors made themselves at home on the lands whose yield kings had assigned to them. While this relieved rulers of the onerous need to count and tax the peasantry, it also starved them of revenue and made it harder for them to control their supporters.

    When, in the year 800, the Frankish king Charlemagne decided that he was a new Roman emperor, it was already too late. In the following centuries, royal power declined as aristocrats asserted ever greater autonomy and knights set up their own castles. The Holy Roman Empire, established in Germany and northern Italy in 962, never properly functioned as a unified state. For much of the Middle Ages, power was widely dispersed among different groups. Kings claimed political supremacy but often found it hard to exercise control beyond their own domains. Nobles and their armed vassals wielded the bulk of military power. The Catholic Church, increasingly centralised under an ascendant papacy, had a lock on the dominant belief system. Bishops and abbots cooperated with secular authorities, but carefully guarded their prerogatives. Economic power was concentrated among feudal lords and in autonomous cities dominated by assertive associations of artisans and merchants.

    ***

    Read more through the link. And join lemm.ee/c/history
