I've had to switch over to moderated comments because of the astonishing volume of gibberish spam lately. If you post a brilliant comment or clever riposte and don't see it right away, that just means I have to approve it.
Tor Books is giving away a free review copy of my upcoming novel (gosh I love writing those words) A Darkling Sea. Go to the Tor.com Web site for details.
If you're related to me, don't sign up.
I read this guest post on Sarah Hoyt's 'blog, and it got me to thinking.
As Mr. Begley points out, one of the constant refrains of our time is how "the future" that we're living in doesn't look like "the future" as depicted in mid-20th Century science fiction and popular science articles. We don't have flying cars and Moon bases, but we do have pocket phones with faster processors than any computer on Earth in the year I was born. Our technology has gone in the direction of compactness and precision rather than scale and power.
The question we should be asking is not "Where's my flying car?" but "Why did we ever expect the future to look the way we imagined?"
The Wrong Curve: Many extrapolations of current trends into the future tend to fail because of what I call "the wrong curve." A science fiction writer in 1950 could look at the speed of travel over history, from walking to railroads to airplanes to jets, and see an exponential curve bending ever upward. By the year 2000 we'll be going faster than light!
Well, no. It turns out speed of travel is an S-curve: a slow build, rapid acceleration, then a slowdown as it bumps against new limits. Population growth seems to be similar, knocking the legs out from under all the overpopulated dystopian Americas that flooded science fiction in the 1970s.
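To see how treacherous this is, here's a toy sketch in Python (the numbers are invented for illustration, not drawn from any real travel data): an exponential curve and a logistic S-curve that start at the same value and grow at roughly the same initial rate look nearly interchangeable early on, then end up in utterly different places.

```python
import math

# Toy comparison: exponential growth versus a logistic S-curve that starts
# at the same value and grows at about the same initial rate, but has a ceiling.
RATE = 0.08       # arbitrary growth rate
CEILING = 50.0    # arbitrary upper limit for the S-curve

def exponential(t):
    return math.exp(RATE * t)

def s_curve(t):
    # Standard logistic form, normalized so that s_curve(0) == 1.
    return CEILING / (1 + (CEILING - 1) * math.exp(-RATE * t))

for t in (0, 10, 25, 50, 100, 200):
    print(f"t={t:>3}   exponential: {exponential(t):>12,.1f}   s-curve: {s_curve(t):>6.1f}")
```

Fit only the first couple of decades of data and you can barely tell the two apart; extrapolate a century or two out and one curve has you going faster than light while the other has quietly flattened against its limit.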
But that isn't the only Wrong Curve writers use. Some just extrapolate in a straight line, which means they fail to see things like the exponential curve of Moore's Law, or the cyclical swing of energy costs. Others assume trends are cyclical when they aren't -- the Sexual Revolution of the 1960s seems to have marked a fundamental shift in society, not just a passing fad.
For science fiction writers and popularizers, the effects are at worst embarrassing. But governments and businesses get hammered by the Wrong Curve all the time. Right now in the USA we're facing a pension crisis because the mid-century architects of pension systems thought that the number of workers entering the workforce would continue to grow faster than the number of retirees. The President's new health-care system depends on costs and revenues following curves which are somewhere between "highly optimistic" and "sheer delusion."
Missed Boats: One reason some writers' visions of the future pick the Wrong Curve is that they've missed a boat. This actually is worse among serious real-world futurists than among SF writers. I suspect the reason is that actual scientists and experts tend to focus on the areas they know best, while science fictioneers are intellectual magpies. So if you put nuclear physicists in charge of predicting the future, they'll concentrate on things like nuclear power and its applications, or the dangers of nuclear weapons, and ignore the Green Revolution or the threat of Islamist terrorism.
The obvious Missed Boat in most mid-century visions of "the future" is computer technology. Not just the curve of improvement, but the applications and effects. Another Missed Boat is the promise of biotechnology. Science fiction writers have been all over that one while the professional futurologists still seem to be trying to ignore the implications.
Ignored Ripples: Even if you guess the right curve and don't miss any boats, you're probably not going to think about the ripple effects of your changes. One can imagine a writer like Isaac Asimov circa 1975 guessing that computers might become small enough to carry in your pocket and put you in touch with a global network of all kinds of information, and he might decide that changes in sexual roles and relations would continue. But would he predict that sending nekkid pictures would become an important political issue?
In fairness, Ignored Ripples are likely the most difficult aspects of the future to predict. In order to make that kind of prediction about the world of the future one would need to possess a nearly comprehensive understanding of the world at present, which nobody really has (least of all the people who think they do).
Non-Eternal Verities: Finally, many visions of the future stumble over the assumption that some things will "always remain true." I suppose this is, in a way, a case of the Wrong Curve -- the assumption that there is no Curve.
One sees this most in social extrapolations. Rudyard Kipling's "As Easy As A.B.C." takes place after a 20th Century plagued by genocidal wars and murderous ideologies, in a world where "crowd-making" is a crime and cheap global airship travel is routine -- but the London music-hall circuit is still thriving!
Within the genre ghetto, one can point to any number of distant planets in the far future populated by Texans with Stetsons and six-guns, or Japanese with samurai and ninjas, or Scotsmen afflicted by kilts and haggis. (Sometimes there's a nod to the colony founders deliberately trying to recreate a semi-mythical view of the past; sometimes there's no such thing.)
In point of fact, societies on Earth are a lot more plastic than we realize. We tend to look for continuities and ignore the disconnects. To a modern, the religious and dynastic wars of the Middle Ages make literally no sense. We try to shoehorn the Crusades or the Hundred Years' War into modern narratives of nationalism and economics because we simply cannot make ourselves understand them in their original context. (And when we ourselves are attacked by Medieval warriors motivated by faith, we scramble desperately to discover the "root causes" of their hatred, ignoring their own quite clear public pronouncements.)
To sum up, we shouldn't be surprised that "the future" in which we find ourselves doesn't look much like the predictions of Tomorrowland or the Club of Rome. The surprise is really that anyone should expect to predict the future at all. After all, there are no "historical forces" -- just the choices and actions of billions of humans all over the world. Sometimes their trillions of daily interactions seem to point in a single direction and we recognize a "trend" or a "tendency," or we flatter ourselves that we've discovered a "law" of human behavior.
But humans aren't molecules. You can't set up the experiment again and repeat it. You can only watch the system play itself out, and try to guess what's going to happen next. If you get it right even half the time you're a prophet.
Today I'm proud to introduce the first guest post on this blog, by the redoubtable Sarah Hoyt (who normally blogs at According to Hoyt). Riffing off my blog title, she tells us what sorts of things she says when it's just the caffeine talking -- and about her new book:
It’s Just the Caffeine Talking
About two years ago I started blogging daily. I think originally I had something like sixty daily readers. Now, except for strange and unexplained dips, I have around two thousand individual IP hits per day, not counting the people who get it by email.
So . . . what am I doing that is so wonderful?
Mostly? Showing up. The routine goes something like this: Stumble out of bed, cross the room to my desk, start typing. Sometimes I have caffeine first.
What comes out is usually what has been in my brain for a while, or something that has.just.got.under.my.skin at that time. Usually visiting Facebook is enough to get something under my skin (and it looks sorta like Alien, getting out.)
The posts range from stuff that just happened in my household to whatever bit of politician or publisher has got stuck between my teeth.
Occasionally there are even cute cat pictures.
So, what do I have to say about my blog as a promotional tour?
Uh . . . I don’t know.
I was told when I started this – which I grant you is a while back – that a blog with a thousand daily readers provided an adequate platform for launching a book. And maybe it does. I just don’t know at this point.
I’ll be able to tell you more clearly once I’ve seen figures on the laydown of my next two traditionally published books. As far as the indie goes, right now I get more mileage out of taking things free on Amazon – but perhaps that’s just me.
On the other hand I enjoy the blog and there’s a good chance I’m creating the sort of dedicated readers who will evangelize a writer – and not just her blog.
And yet, on the other hand, perhaps it is time I set up an annual fundraiser and a subscription system, if only because of the sheer massive amount of work that goes into the blog.
Work? you say. Work. Look, I don’t write short. Yes, I know, I could. But writing short in non-fiction involves a lot of cutting and rearranging for me. I simply don’t have the time for that. So my blog posts run around 1.5k to 2k words. A day. Weekends included. That is, on the very low side, and accounting for the days of cute cat pictures and guest posts, at least two very fat novels a year.
Now, that’s not how you measure writing, and it’s entirely possible that fiction and non-fiction come from different parts of my brain. But the truth is when the daily blogging started, the fiction output decreased. Related? I don’t know. If I knew how this here “writer’s thing” worked, I could control it better.
The other consequence of the daily blog is of course that I can no longer be as enigmatic as a sphinx when it comes to my personal beliefs. If you write every day, you’re going to end up writing about something you care about. Well, unless you’re really good at contemplating your belly button lint, and mine is too clean to have any.
Which brings us to the difference: the function of a fiction writer is to entertain, while the function of a nonfiction writer is to inform, to polemicize, and/or to incite. At least nonfiction as I write it.
Will the second interfere with the first? Who knows? Of course if I had to choose I would choose fiction over nonfiction. It’s more fun.
But does the nonfiction publicize the fiction? Don’t know.
Mostly I write by the grace of caffeine, and most of the time not even real caffeine but promised caffeine – i.e., the caffeine I’m truly going to get just as soon as the blog post is written.
And until I get better data, this is what I’ll continue doing.
I do know that boring people at other people’s blogs on a blog tour DOES seem to work. So – clears throat – my book, Darkship Renegades, comes out from Baen Books on Monday.
The book is supposedly about a bunch of serious stuff, but really, it’s about space travel, ray guns and a mad cyborg. (And who doesn’t need a mad cyborg?)
So, buy the book. And maybe I’ll continue to be able to afford caffeine.
Look at today's date. It's October 11, 2012. So if you're writing a check or dating a homework assignment, it's "10/11/12." (Unless you're a European, in which case you are doing it wrong.) Mildly amusing, yes? The numbers are in sequence.
What you probably don't know is that time is running out on that particular resource! Mildly amusing date sequences are going to disappear soon! In 2013 we'll have November 12, in 2014 we'll have December 13, and then it's a long drought of mildly amusing date sequences until January 2, 2034. That's twenty years!
Oh, there will be one bright spot in that bleak landscape -- April 3, 2021 -- but it's still a long grim slog until we can again enjoy the thrill of sharing mildly amusing date trivia with the bank teller or the supermarket checkout clerk.
But there is a shining light on the horizon: January 23, 2045. Start planning your party now.
I just bought a new computer -- which is one reason it's been so long since my last 'blog post. Usually I buy a new computer when the old one breaks, but this time I decided to replace the old one while it's still functional. So now I have a spiffy new MacBook.
And I'm disappointed.
In the past, buying a new computer has meant new capabilities. The new machines could do stuff the old ones couldn't. There was new software. New little widgets and applications which were interesting and useful. New music shoveled into the iTunes folder. Having a new machine made new games available.
Not this time. This time feels like a step backward. The OS X operating system doesn't support most of my old software. I'm going to have to spend time tediously converting old documents to some format which the new word processor can read. All the custom fonts I liked no longer work. My games don't work.
I keep asking myself: why do they do this?
When I buy a new blender I don't have to get a whole new shelf of cookbooks because it won't make my old dishes. When I buy a new car I don't have to buy a new kind of fuel. When I get new shoes I don't have to buy new socks or have my feet surgically altered.
Apple has always been the "cool kid" in the industry. They get praised for their attention to elegant design. But this isn't elegant at all. It fails the fundamental test of any new technology: "Does it work better?" Backwards compatibility should be the first consideration of any new software design.
I think that's ultimately what I find so bothersome. It's not the obvious venality ("Buy our cool new operating system! Now buy all new software!") so much as the stupidity. The inelegance.
For much of my life personal computers have been a technology which empowers, which expands our capacities and our reach. I don't get that feeling any more. This computer is constraining me and annoying me. It's not letting me do what I want.
Do you hear that, Mr. Cook? Mr. Levinson? Sir Jonathan Ive? You're not helping me. Stop making software which makes all my old files obsolete. Make buying a new computer exciting again.
There's an interesting post up at Sarah Hoyt's blog about motivations for villainous characters. She takes the position that no one is consciously evil, and evil behavior is usually motivated by a desire to do good or to fit in.
This is a very common view in science fiction; it's practically the default. With some notable exceptions, science fiction has antagonists rather than villains. It's part of the mindset that goes with writing science fiction: problems are to be solved, the universe is comprehensible, and things follow rational laws. If someone opposes your main character, that person has valid reasons (though they may be based on faulty information).
Evil, with a capital E, is a fantasy concept. It requires the existence of a supernatural yardstick marked with Good at one end and Evil at the other. It requires an absolute standard, and one of the strengths of science fiction is that it treats standards of behavior as infinitely variable.
(This is also science fiction's Original Sin: it treats human behavior as something which is controllable, and therefore the discussion becomes how and by whom it should be controlled. And that's a very slippery slope above a very deep pit with bodies at the bottom.)
However, I'm afraid I must disagree with Mrs. Hoyt about this. There are people who knowingly do evil. Consider career criminals like Mafia members. They're under no illusions that what they're doing is right or good. I recall reading one account of a Mafia soldier who had plenty of his own money but preferred to take his girlfriend out on the town with a stolen credit card. Screwing someone else over just made the whole evening more fun.
Maybe this is a sex difference. Part of being male is the desire to be the top male, and that is accomplished in part by stamping down all the others. Being the baddest dude around is fun. There's a solid biological basis for this. When you believe you're high status, your body makes more serotonin, and that makes you feel good.
To a very great extent our whole concept of good and evil developed to preserve the social group against the innate drives of its individual members. Our brains invented right and wrong to help keep our bodies in line, and societies where the brains won prospered and perpetuated themselves.
But those drives are still there, and there are still plenty of people who indulge them. And plenty more people who fantasize about indulging them (because we all chafe at our roles in society), which is why a good villain can make a story.
My story "The Eckener Alternative" is now available via podcast, on the venerable Escape Pod audio webzine (or whatever the kids are calling them this week). Download it and listen on your next boring commute through time!
Today was the first session of this year's Launch Pad Astronomy Workshop at the University of Wyoming in Laramie. I'll give my impressions in roughly chronological order.
Laramie itself is a nice place, somewhere on the borderline between "town" and "city." The University sprawls across the eastern side of town, there's an immense railroad yard cutting through town like a river, and overall the place is an appealing mix of college town and cow town.
The workshop convened at 9:30 a.m., and it's certainly an impressive group of people. Participants include Michael Albo, K.C. Ball, Cecil Castellucci, Me, Greg Fishbone, Liz Gorinsky, Shariann Lewitt, Shelly Li, An Owomoyela, Deborah Ross, Christopher Rowe, Eric Stone, Todd Vandemark, Jennifer Willis, and Danielle Wolff.
Our instructors were Dr. Mike Brotherton, Dr. Jim Verley, Dr. Stanley Schmidt, and the alarmingly polymathic Dr. Henry Stratmann.
Fortified with Pop-Tarts and homemade scones, we settled down to enlarge our minds. Dr. Brotherton started things off by trying to give us a sense of the scale of the Universe, taking us from a space the size of our classroom to the entire observable Universe in hundredfold jumps. He introduced us to some favorite units of astronomers, the Astronomical Unit, the Light-Year, and the mighty Parsec. Since his lecture took us outward from Dr. Brotherton himself, he modestly confessed that "I am the center of the observable universe, but it probably doesn't revolve around me."
The legendary Dr. Schmidt followed in the same vein, walking us through the creation of a scale model of the Solar System. If the Sun is a thirteen-inch globe in the classroom, the planet Mercury is a mustard seed out in the hall 45 feet away, Venus is a peppercorn outside the building, Earth is another peppercorn a block away (circled by its poppyseed Moon at a distance of three inches), and so on. On that scale the nearest star system, Alpha Centauri, would be in Tokyo.
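For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The real-world figures are standard approximations I've plugged in myself, not Dr. Schmidt's exact numbers, but the scaled results land remarkably close to his props.

```python
# Rough check of the 13-inch-Sun scale model; the constants below are
# standard approximate values, not Dr. Schmidt's own figures.
SUN_DIAMETER_KM = 1_391_400
MODEL_SUN_IN = 13.0
SCALE = MODEL_SUN_IN / SUN_DIAMETER_KM   # model inches per real kilometer

distances_km = {
    "Mercury's orbit":   57.9e6,
    "Venus's orbit":     108.2e6,
    "Earth's orbit":     149.6e6,
    "Moon (from Earth)": 384_400,
    "Alpha Centauri":    4.37 * 9.4607e12,   # 4.37 light-years in kilometers
}

for name, km in distances_km.items():
    inches = km * SCALE
    if inches > 63_360:                      # more than a mile
        print(f"{name}: about {inches / 63_360:,.0f} miles")
    elif inches > 12:
        print(f"{name}: about {inches / 12:,.0f} feet")
    else:
        print(f"{name}: about {inches:.1f} inches")

print(f"Earth itself: {12_742 * SCALE:.2f} inches across")   # peppercorn-sized
```

On those figures the Mercury mustard seed lands about 45 feet down the hall, the Earth peppercorn about 117 feet away with its Moon three and a half inches off, and Alpha Centauri roughly 6,000 miles out -- which is indeed the neighborhood of Tokyo, as seen from Laramie.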
Dr. Verley spoke about common misconceptions and how to deal with them. As an illustration of how everyone is vulnerable, he showed some rather painful videos of Harvard graduates (and faculty) completely blowing the questions of what causes the seasons and the phases of the Moon. We all had a good laugh at the Ivy League idiots, but Dr. Verley's point was that everyone labors under scientific misconceptions, and it's very difficult to pry an idea out of someone's head once it's gotten in.
The final lecture was by far the most enthusiastic, as SF writer and amateur astronomer Jerry Oltion spoke about amateur astronomy -- how telescopes work, how to make them, what to look at once you've got a telescope, and how amateurs can sometimes do professional-caliber science.
With that, we broke for dinner and blogging, with the hope of clear skies for observing after sunset. Tomorrow: PHYSICS!
Apologies for the long hiatus. Among other issues, we took a family vacation and my internet access was limited. Anyway, I'm back.
Yesterday I happened to read a posting on the usually interesting io9 blog (the science fiction branch of the Gawker Media empire) -- this one, about the tension between expansiveness and self-referentiality in serial media. Solid stuff.
And then I read the comments, which veered off into a discussion of copyrights, including a fair amount of gloating about the prospects of iconic media characters like Mickey Mouse or Superman falling into the public domain. There was also some booing and hissing about how the giant media companies have been lobbying for longer and longer copyright terms.
There's plenty of room to argue about the merits of limited vs. perpetual copyrights (full disclosure: I'm with Mark Twain on this one), and I won't go into that. However, I am kind of puzzled about the glee with which people contemplate the idea of Disney no longer owning exclusive rights to Mickey Mouse.
Is it just envy, tricked out as fashionable anti-corporate sentiment? Or are there people out there who actually feel injured somehow by Disney's copyrights? How could that be?
Let's look at some possibilities.
Suppose I'm burning to create my own Mickey Mouse story. Perhaps I have a cool idea for a Mickey comic or film. Well -- if the idea is really good, or if I'm really dedicated and make my own short film -- it just so happens there's a huge media company which really likes Mickey Mouse. I can parlay my Mickey story into a career working for Disney or some of its many, many licensed publishers.
This isn't hypothetical, either. Cartoonist Don Rosa was a huge fan of Carl Barks's Scrooge McDuck comics, and eventually wound up working for comics publishers doing licensed Disney comics. In other media, Paramount's Star Trek franchise was pretty much taken over by former Trekkies.
Okay, you say. What if my "vision" for Mickey doesn't agree with Disney's? What if I want to do a darker and edgier Mickey, or explore his relationship with Minnie in explicit detail?
Well, I can draw my own funny animals and have them do whatever freakish acts I want. Call them Mike and Millie Mouse. There's a whole genre of "furry erotica" comics, and I can probably get them published. "But that's not Mickey!" you say. "Your freedom to show Mickey Mouse doing kinky sex is curtailed by Disney's copyrights!"
True -- but if the only reason to do kinky Mickey is the frisson of turning a child-friendly corporate icon into a pervert, then that thrill is destroyed as soon as anyone can do it! When Mickey enters the public domain, there will probably be a whole rush of attempts to do just that, and they'll all fall flat because it won't be transgressive anymore.
And note that I can write my own Pervert Mickey comic, or even make a film -- I just can't sell it and make money at it. Which means the beef about copyright isn't that it infringes artistic freedom at all. The problem is that those greedheads at Disney are preventing other greedheads from making money off of Mickey.
If anyone has any ideas about how copyright actually inhibits creativity, please comment and explain. I'm serious; this is a big issue nowadays and I'd love to hear other viewpoints.