Oliver Stone’s Bush biopic is all about a confused man-child with daddy issues

March 3, 2009

FROM JASON’S RENTAL CARD — Just because you’re simple doesn’t mean you’re uncomplicated.

That’s how I felt about George W. Bush — or at least his silver screen caricature — after watching Oliver Stone’s W. Sunday night. That, and surprisingly a small amount of pity for a man whose policies I’ve despised and whose actions I’ve cursed.

I told Andrew after watching the film that it’s too apologetic, too humanizing of the 43rd president. It gives ol’ W. a bit of leniency by showing his Oedipal angst and constant quest to find self-worth despite his skin-deep Texas swagger. Stone pushes the younger Bush as a man-child desperately seeking his father’s attention and trying to come to terms with his lack of career acuity, and it feels like a back-handed sympathy party.

From his failure to make it as a blue-collar salary man, to his drunken Yale fraternity nights, and then his coattails ride into the political arena, Josh Brolin’s Bush seems more a confused teenager in an adult body than the evil corporate oilman his opponents have labeled him.

And trust me, the guy from The Goonies (Brolin) is good. The face is Brolin’s, but the trademark derisive snicker is Bush’s, as is the Lone Star State strut and the halting delivery of contorted Bushisms lifted straight out of the newsreels. He infuses W. with a mannish petulance, showing Bush trying desperately to maintain a pretense of control as his decisions constantly kick him in the groin.

It’s the facial expressions, really, that clinch the performance. Brolin gives the recognizable Bush squint while mulling the really tough ideas, radiating the idea that if he can only knit his brow a little tighter then he might be able to pierce the veil of information around him and find out what is really going on, and why his policies are having such disastrous consequences.

Brolin and Stone also dally a bit, much to my delight, on the right-wing religious angle, making a fairly acute statement on the pandering of Bible Belt politicians.

“Nobody’s ever going to out-Texas or out-Christian me again,” Brolin-as-Bush says after losing his early Congressional bid. He spends the rest of the film pausing frequently for showy prayer breaks and even telling his preacher that God is speaking to him audibly.

My stance on such things: Hearing imaginary friends talking to you is a sign of paranoid delusional schizophrenia.

Bush is in the reticle with this one, but Stone doesn’t miss an opportunity to skewer Dick Cheney (Richard Dreyfuss) as a manipulative, power-hungry warhawk; to simultaneously golf clap and give a shame-on-you to Colin Powell for his role as a Bush enabler; to jab at Karl Rove’s smug, calculating nature; to borderline impugn Donald Rumsfeld as certifiably insane; and to cast Elizabeth Banks as an (unrealistically) sexy version of Laura Bush.

I don’t know exactly why W. scored just a 59 percent rating on Rotten Tomatoes, but I’d be willing to guess it has to do with the political charge of the film; it scores slightly higher with a 6.9/10 rating on IMDB.

Personally, I’d recommend it slightly higher than either of those metrics, but with the admonition that it’s not going to spur much demand for repeat viewing. I definitely wouldn’t buy W., especially considering how dated it will seem as we put the Bush presidencies behind us for good.

It will be interesting to see in eight years whether Barack Obama will require Stone to rev up the camera for a similar treatment.


Fireworks and a ‘history’ lesson

July 4, 2008

FROM JASON’S TIME CARD — You know what’s cool? I got to “cover” my city’s annual fireworks display last night. It was a fluff assignment, and a welcome break from the shootings, stabbings, and fiery deaths I’ve been handling lately.

Even better: While I was hanging out with my photographer near the fireworks launch site, I overheard the following (and very edutaining) conversation:

“Grandpa, why do we have fireworks?” asked a sandy-haired young boy who looked about 8 or 9 years old.

Grandpa never batted an eye. There was no ironic smile. Completely convinced of his historical accuracy, Grandpa learnedly replied, “Because in 1774, the British gave American independence and they celebrated with fireworks.”

Hail, the American public education system.


YesterGames #5: Commander Keen in Goodbye Galaxy (Secret of the Oracle)

March 12, 2008

FROM JASON’S CYAN AND MAGENTA SCREEN — It’s hard to imagine, but there was a time when side-scrollers didn’t work on the PC. Long after the folks over in Japan had figured out how to Mario themselves into Scrooge’s Money Bank-esque piles of cash, the PC was still lagging dangerously behind.

In a way, all of the modern computer games — Bioshock, Portal, Crysis, Sins of a Solar Empire, Supreme Commander, everything — owe all their success to a little 1990 game called Commander Keen (download). Its code surmounted a major problem facing PC gaming: the lack of smooth, console-style scrolling.

Inspired by Duck Dodgers in the 24½th Century, Buck Rogers, and other old sci-fi serials, Keen tells the story of Billy Blaze, an 8-year-old boy with an IQ of 314 who journeys across the galaxy trying to thwart his nemesis, Mortimer McMire. Interestingly enough, Billy’s backstory was re-written after the release of Wolfenstein 3D (both created by Id Software) so that he was the grandson of Wolfenstein hero B.J. Blazkowicz.

Id programmer John Carmack discovered a coding trick, adaptive tile refresh, that allowed smooth scrolling on the EGA graphics card. The team’s first move was to port the first level of Super Mario Bros. 3 to the PC and try to sell Nintendo on getting into the home computing market. Nintendo purportedly came close but eventually declined, and Hall, Carmack, and their collaborators decided to make an original game instead.
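
If you’re curious what that trick boils down to, here’s a minimal sketch of the general idea, written as a hypothetical C program rather than anything resembling Id’s actual code: remember what every screen tile looked like last frame and repaint only the tiles that changed, then let the EGA card’s panning and start-address registers smooth out the motion between tile boundaries.

    /* Toy model of the "dirty tile" bookkeeping behind adaptive tile refresh.
       The sizes, names, and the printf stand-in for real EGA drawing are all
       invented for illustration. */
    #include <stdio.h>
    #include <string.h>

    #define COLS 20
    #define ROWS 5

    static char on_screen[ROWS][COLS]; /* what was drawn last frame         */
    static char world[ROWS][COLS];     /* what the game wants on screen now */

    static void draw_dirty_tiles(void)
    {
        int repainted = 0;
        for (int y = 0; y < ROWS; y++)
            for (int x = 0; x < COLS; x++)
                if (on_screen[y][x] != world[y][x]) { /* tile is "dirty"   */
                    on_screen[y][x] = world[y][x];    /* repaint only this */
                    repainted++;
                }
        printf("repainted %d of %d tiles\n", repainted, ROWS * COLS);
    }

    int main(void)
    {
        memset(world, '.', sizeof world);
        memset(on_screen, 0, sizeof on_screen);

        draw_dirty_tiles();    /* first frame: every tile gets drawn        */

        world[2][3] = '@';     /* Keen steps onto a tile...                 */
        draw_dirty_tiles();    /* ...and only one tile gets repainted       */

        world[2][3] = '.';     /* he moves one tile to the right...         */
        world[2][4] = '@';
        draw_dirty_tiles();    /* ...so only two tiles get repainted        */
        return 0;
    }

The payoff is that each frame touches a handful of tiles instead of the entire screen, which on 1990 hardware was roughly the difference between slideshow scrolling and the console-smooth motion Keen became famous for.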

The account of that venture is pretty widely established and you can read the 3D Realms version if you want. I don’t think it’s necessary for me to rehash it.

Of all the Keen episodes — there are six in the main run, plus a Game Boy Color title — I think Secret of the Oracle (the first half of the Goodbye Galaxy story arc and the fourth in the series) is by far the best. First of all, it was the first to be backward-compatible with CGA monitors, which meant I could play it in its four-color glory: black, white, cyan, and magenta. It also boasted non-linear level selection once the first two stages were completed.


Keen’s level design was tops in 1991. This player knows what he’s doing.

But maybe the best thing about Oracle was the level design. These were still the days of randomly floating platforms and floating tchotchkes, but only in Oracle did the Id team manage to make these elements look somewhat naturalistic and contiguous. The 2/3 view didn’t hurt, and the large, solid background elements like trees and desert, houses, the infamous slug statue, and Billy’s rocket ship added a sense that this wasn’t a world made up of just 16×16 sprites.

I also think a big reason the early PC gaming community adopted Billy Blaze as its ad hoc mascot is that he’s so geek-relatable. Computers in the pre-Windows days weren’t exactly user-friendly, and not everybody was savvy enough to get drivers to work or even learn DOS commands (or DOS Shell). Those who developed even basic early PC literacy were pretty bright, and I, especially, felt like I could identify with a kid genius slinging lines of DOS syntax and BASIC commands.

Maybe that’s a little narcissistic, but that’s how I felt as an 11-year-old 3.5-inch disc jockey.

Leave Billy alone long enough in-game and he’ll sit down and read a book — just like me. He’s also got some young punk cred; a trick in the Temple of the Moon level will make him moon you. He’s got that superior cocked eyebrow going on in the title screen. He’s also got that slightly lopsided grin that maybe I stole from him subconsciously.

One last thing: I always felt there was a little bit of ambiguity in the Keen games about whether the events were really taking place. The narrative always played it straight: Yes, Billy was really planet-hopping to fight the Vorticons et al. But I always thought that the entire Keen world might just be a byproduct of Billy’s imagination. I mean, I’m not too proud to admit that as a small boy (age 16 or 17 or 24) I would don a football helmet, grab a Captain Power lightgun and rush around the basement acting out some epic quest. I wonder if that’s all Billy was doing and if that means the surrealism of the game was entirely a figment.

There are better platformers out there now, of course, but Secret of the Oracle still holds up remarkably well (if you’re slightly forgiving). It certainly looks better than many, many Famicom/NES titles from the same era. Hall continues to waffle about the future of the franchise — he doesn’t hold the intellectual property rights anymore — but says he wants to someday develop another episode.

Let me say this: If a Keen-a-la-Mario64 reimagining hit the Nintendo DS today, I would pay double the retail price to get it.

NOTE: I was already planning to talk about this game, but Ninjarabbi gave me a kick in the butt. I hope Scrym talk about Keen on Geeknights soon.


YesterGames #3: Where In the World Is Carmen Sandiego?

March 6, 2008

FROM JASON’S GLOBE-TROTTING COMPUTER — My biggest fear, the one paranoia that keeps me awake some nights, is that we’re breeding idiots. The only thing keeping hope alive is Where In the World Is Carmen Sandiego (download).

In America, we’re very good at certain things. Geography is not one of them. A 2006 survey conducted by the National Geographic Society found half of young Americans can’t find New York on a map and only 37 percent can find Iraq. The Society gave 510 young U.S. citizens, ages 18 to 24, a weighted geography test and found they answered only about 54 percent of the questions correctly on average; most earned a failing score.

And we’re not just talking about being able to label state capitals, here, folks. My fellow Americans don’t understand much about foreign culture, language, religion or history. Three-quarters of those tested didn’t know that Indonesia is a predominantly Muslim nation, and the same number thought English is the most-spoken language in the world (it’s actually Mandarin).

The study also found:

  • 75 percent could not find Israel on a map.
  • 44 percent could not find Israel, Iraq, Saudi Arabia, or Iran.
  • 88 percent could not identify Afghanistan on a map.
  • 54 percent did not know Sudan is in Africa.
  • 40 percent did not know Rwanda is in Africa.
  • 35 percent were able to identify Pakistan as the country where 70,000 people died in an earthquake in October 2005.
  • 67 percent were able to find Louisiana on a U.S. map.
  • 52 percent were able to find Mississippi on a U.S. map.
  • 69 percent found China on a map — and it registered as one of the few recognized countries outside of North America.

Public schools keep churning out the geographically and historically illiterate, but I credit software developer Broderbund with doing more to further my knowledge of those subjects than any teacher or class.

The Carmen Sandiego series of computer games was born in 1985 with Where In the World Is Carmen Sandiego?, which would run on ludicrously slow computers and CGA monitors. It had the advantage of being prevalent at a time when edutainment software still had a viable share of the market, and I remember playing it at school and then begging my parents to buy a copy to run at home on my 286.

The game is little more than a test to see if you know that Indians speak Hindi, that Tokyo is a world electronics capital, that the Aztecs ruled what is now Mexico, that the Niger River is in Africa, that sherpas can be found in Kathmandu, and that Ferdinand Magellan didn’t quite circumnavigate the globe.

But it’s disguised as a crime caper, allowing you to chase down goofy suspects who’ve stolen impossible MacGuffins — like the Leaning Tower of Pisa — and gone on the lam. Using clues gathered as you fly around the world, you have to stay on the thief’s trail, get a warrant, and make an arrest.
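
Mechanically, the warrant is just a cross-check of your accumulated clues against a dossier of suspect traits. Here’s a purely hypothetical sketch in C of how that check could work; the henchman names, traits, and clue strings are made up for the example and have nothing to do with Broderbund’s real data files.

    /* Toy version of the warrant logic: filter the dossier by the clues
       gathered so far, and issue a warrant only when exactly one suspect
       fits. Everything here is invented for illustration. */
    #include <stdio.h>
    #include <string.h>

    struct suspect {
        const char *name;
        const char *hair;
        const char *vehicle;
    };

    static const struct suspect dossier[] = {
        { "Carmen Sandiego", "black", "limousine"  },
        { "Sloppy Joe",      "red",   "limousine"  },
        { "Fast Frannie",    "black", "sports car" },
    };

    int main(void)
    {
        /* Clues the player has picked up from witnesses so far. */
        const char *hair_clue    = "black";
        const char *vehicle_clue = "limousine";

        int matches = 0;
        const char *warrant_for = NULL;

        for (size_t i = 0; i < sizeof dossier / sizeof dossier[0]; i++) {
            if (strcmp(dossier[i].hair, hair_clue) == 0 &&
                strcmp(dossier[i].vehicle, vehicle_clue) == 0) {
                matches++;
                warrant_for = dossier[i].name;
            }
        }

        if (matches == 1)
            printf("Warrant issued for %s.\n", warrant_for);
        else
            printf("Not enough evidence yet: %d suspects still match.\n", matches);
        return 0;
    }

Miss a clue and more than one suspect still matches, so no warrant; that’s the whole tension of the endgame chase.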

Sure, there are some softball clues lobbed in there (“She was asking what the exchange rate is on the peso.”) but there are also some brain busters. I played this game for two hours Monday and Tuesday and was hooked the whole time, smiling stupidly to myself as I relived a huge part of my childhood and stretched my brain.

I also couldn’t get the theme song from the Carmen Sandiego game show on PBS out of my head. When I was 11, I watched it every afternoon at 5 p.m., howling pre-adolescent profanity at the screen because the questions were so easy.

You’ve got to watch this. Full episode ahead:

If you have a student age 8 to 13 (or maybe a little older if they aren’t wusses), I can think of no better learning tool than Where In the World Is Carmen Sandiego?. The 1990 deluxe edition can be found at Home of the Underdogs, or if you have the scruples it can still be purchased from Broderbund for $10.

HOTU also has downloads for:


The Rise of Personal Computers and the Revolutions That Caused Their Evolution Part 2

December 5, 2007

FROM THE STUDYING MIND OF ANDREW — The breakthrough of the transistor heralded a new era for computers. First-generation computers such as ENIAC and EDVAC were massive machines that spanned whole rooms and contained thousands of vacuum tubes. While this design worked, it left little room for improvement, as the approach required an increase in size to produce an increase in speed. The transistor, however, brought an age of miniaturization, effectively replacing the vacuum tube. The first transistor was built on a germanium semiconductor base and used a diode to control the flow of electrons; this was known as the grown-junction transistor. It was not the final form, though. While the grown-junction transistor was important, the design was unstable: germanium, despite holding the potential for great speeds, proved too unreliable for any commercial use. Silicon was later found to be a much more suitable semiconductor for transistors.

This discovery is what jump-started the transistor into the mass market. It is a perfect example of the evolutionary process that defines the world of computing. This is the exact form of development that Ralph Gomory, mathematician and IBM executive, believes is important in studying computers. He finds that the world of computers is defined through an evolutionary process, as are almost all technological innovations. Yet, while the evolution of the transistor into its many forms was important, he believes the creation of the transistor itself was a true revolutionary breakthrough, a term he admittedly does not use lightly. Gomory states that this computer revolution is equal in magnitude and importance to that of the steam engine.

The large computers which made up the first computer era were extremely costly, which often restricted them to government and research uses. The main form of interaction with these computers was through punch cards, which required holes to be punched in specific positions to represent the binary logic behind a program. In time, computers began to utilize the transistor and various other components to become more reliable and manageable. However, these new transistor-based computers were not much of an improvement for the average consumer. It took a new invention to usher in the second generation of computers.

In 1958, the integrated circuit was developed by Jack Kilby at Texas Instruments. The integrated circuit took all of the components of an electronic circuit and miniaturized them onto a single semiconductor substrate. The integrated circuit in turn led to the development of the microprocessor, an advanced integrated circuit that became known as the Central Processing Unit (CPU) of the computer. The first microprocessor to hit the market was Intel’s 4004, released on November 15, 1971. The 4004 held 2,300 transistors and had a modest clock speed of 740 kHz.

This evolution began to bring the world of computing into John Doe’s hands. In 1974, Arthur Robinson wrote an article on the change of computing from the “maxi-level to the micro-level.” At that stage the microprocessor was still a relatively new device, yet he had several predictions about its far-reaching implications. At that point in time, the only real encounter the average person had with the world of computing was through the first few hand-held calculators, which were just beginning to be sold (most notably by Texas Instruments).

However, Robinson saw other uses for the microprocessor, including factory machinery, computerized cash registers, and computer terminals. What is important to notice is his vision for the future of computing. While he does not outright envision the personal computer, he gets close to the idea. His idea is that the main users of the microprocessor and the microcomputer would be OEMs (original equipment manufacturers). OEMs would develop unique machines with microprocessors embedded in them to carry out individual jobs, each one tailored towards a specific goal or task. For example, he discusses supermarket terminals, which would be used to keep track of inventory, authorize credit cards, and look up prices. However, he stipulates that these terminals would be linked to a central computer, much as the terminals of first-generation computers were.

Final installment to follow…


The Rise of Personal Computers and the Revolutions That Caused Their Evolution Part 1

December 1, 2007

FROM ANDREW’S HTS 3083 TERM PAPER — The 20th century has seen giant leaps in several technologies, stemming from a massive increase in research and development on all fronts.

With the driving force of two world wars and a massive cold war, it became ever more important for countries to spend billions of dollars on the development of new technologies that would drive their societies, both militarily and culturally, towards dominance.

The most important of these technologies is without a doubt the computer and its permeation into the personal market. The evolution of computers would never be defined by a single invention, but rather by the push towards a singular goal: the personal computer. Closer examination of the history of the computer’s evolution shows an important trend in which affordable and powerful personal computers were the first step in a digital revolution that would drive society both technologically and culturally into the 21st century.

It is difficult to lump the computer into a single invention that could be analyzed. The computer itself is not a singular technology that was just created overnight but rather a major basis for the electronic and computing system. This system is not separate from human society and culture, but would instead ingrain itself into the very fabric of society. Therefore, this paper will focus on the introduction of two key inventions which led to the evolution of this computing system. They are the transistor and the microprocessor, two major components which define the computer and electronic era.

The first step in understanding the evolution of the computer is the study of the engineering and technological efforts after the Second World War. With inventions such as radio and radar becoming increasingly more valuable in the defense market, governments began to view technology as a vital player in the international marketplace. Government spending in research and development began to increase dramatically as well as drive the private sector into the game. It also showed that scientists would play a key role in the post-WWII era.

One of the main hotbeds for this research was Bell Laboratories, an offshoot of AT&T and Western Electric Research Laboratories. Immediately after the war, the director of research at Bell Labs, M. J. Kelly, decided to investigate the role of semiconductors, elements which had already seen several uses. Semiconductors were common components in crystal radios before the electron tube was invented. The main goal of this research was to “gain a deeper understanding of the physics of these substances (semiconductors).”

The researchers began to realize the potential in such materials and decided to take another look at a device which could control the flow of electrons in solids. This research eventually led to the invention of the transistor by Walter Brattain and John Bardeen. In his article published in Scientific Monthly, Ralph Bown, director of research the year the transistor was invented, stated that when the lab publicly announced the invention of the transistor, they chose to remain quiet about their hopes and dreams and instead announced only what they could certifiably state as fact.

In retrospect, Bown states that announcing only the tasks the transistor had already accomplished led to misunderstanding and a lack of interest from the news media when reporting on the invention.

To be continued.


Andy Kaufman should get a producer’s credit for America’s awakening to solipsism

November 28, 2007

FROM JASON’S COPY OF THE GREAT GATSBY — It’s hard to tell whether Andy Kaufman hated his audiences or was some sort of messiah sent to raise them to a new state of mind.

His goal, I think, was never to make the crowd laugh. He insisted he wasn’t a comic and didn’t tell jokes — unless it was to show how flawed conventional humor was.

“I’m not trying to be funny. I just want to play with their heads,” he told The New York Times.

Every Kaufman bit forced the audience through the gears and far past the confines of conventional humor. He made them squirm. He pushed discomfort to an art. He was a study in negative space and his audience’s reaction to it.

In that way, he was a masterful deconstructionist. He wanted to turn the entire idea of comedy on its head. He wanted to try reverse-reverse-reverse-reverse psychology. He wanted to piss people off, and he never made a secret of it — especially in his staged inter-gender wrestling stunts and legendary appearance on Fridays later in his career.

But angering people wasn’t the end goal — it was just a necessary transitional state. He always aimed to show people how to peel through the fake veneer of life and find the elusive truth underneath.

“What’s real? What’s not? That’s what I do in my act, test how other people deal with reality,” he said.

It was a concept that few people understood, especially the network executives he asked to back him. But he bludgeoned his way through show business anyway, pummeling the American public with a do-you-believe-everything-you-see solipsism that was infectious to an entire generation. His flippant attitude toward what could be done or said on television changed the perspective of the multitudes, even if they didn’t realize it at the time.

It was like Kaufman was trying to be unpopular, just to prove how silly the entire notion of culture is.

“There’s a little voice that says, ‘Oh, no, you can’t do that, that’s breaking all the rules,'” he said. “That’s the voice of show business. Then this other little voice says, ‘Try it.'”

Watch how he breaks the crowd in this 1977 HBO Young Comedians special. The audience members don’t know whether to take him seriously. They don’t know up front whether Andy’s stuttering, hesitant, self-effacing front is real. Andy keeps pushing and pushing the limits of their credulity, then slaps them a little in the face to let them know it’s all an act.

Once he had disabused the confused masses of their expectations, he would show them his own home-brewed physical comedy.

It was so tangential to their expectations that they would be just excited and confused enough to fall prey to his abusive alter-egos. Here, Tony Clifton launches a raunchy assault born in the night clubs of both Reno and Tahoe.

Note that Mel Sherer is a plant — he helped Kaufman put together his “Andy’s Playhouse” special that (I think) never aired on ABC. Bob Zmuda was his obvious sidekick, though it’s unlikely the audience had any idea, and Larry Feinberg and Luther Adler were both Jewish comedic actors.

Many mass media outlets that clamored to interview and review the hot new “comedian” didn’t know that he and Clifton were one and the same. Sometimes, in fact, they weren’t — he would give his brother, Michael, and good friend Bob Zmuda turns depicting Clifton — again, just to mess with peoples’ minds.

After he added makeup, shades, and a little bit of weight to the Clifton costume, the gag was so convincing that it continues to baffle fans. Continued Clifton appearances after Kaufman’s death from cancer in 1984 have even added fuel to popular theories that Andy may have faked his own death.

Before his death, he was working on a script about a man who fakes his own death. He told others he wanted to actually do it as a type of performance art. Zmuda even said Andy was obsessed with the idea. But Kaufman did not rise from the dead to revel in the success of his hoax in 2004, as he bragged he would.

But that’s not the point. Kaufman still succeeded by doing what any good absurdist or mentalist does — he convinced us that it was possible that he wasn’t dead, and he kept us talking about it for 23 years. That’s a bigger trick than most men can ever hope to spring, and it’s what made Kaufman’s anti-comedic outlook on life so revolutionary.