A Tar Heel piece that fails to mention our own B-movie great Earl Owensby? (Caveat: I was in "Dogs of Hell".)
I love this article so much.
Just a question: was the two-headed shark done by Cleve Hall and Total Fabrication, Inc.? Ya know, the guy from the 'Monster Man' series, new on Syfy?
I've never seen any of these, but I'm guessing they make Full Moon movies seem like Fellini.
Whoa. Today I learned that people still go to movie theaters. I had suspicions.
How about upping the frames per second and dropping the gimmicky 3D bullshit?
Ron--3D, yes, probably. But I was married to the assistant to the president of a major studio, and I feel safe in saying that a) it's gross ticket sales they go by (not accounting for costs of any sort), and b) they were doing this long before the press made a fetish out of it. I'd say the problem with the press is lack of interest in tracking when you are seeing a projected DVD or 35mm print or whatever in between. Not that doing so wouldn't pose challenges--one of which is dodgy theater owners refusing to specify to papers what the format will be (yes, that has happened in Portland).
Ironically one of the major drivers for digital cinema is the press.
Films are judged by press-reported opening weekend box office sales. More screens (without more expensive 35mm prints) = more opening revenue. 3D is also entirely digital, so more 3D, with higher ticket prices = more revenue.
Larger opening weekend box office means more momentum, millions saw it so it must be good.
hi graham, here is your answer, sort of.
while the goal of both digital and analog projection is to fool the eye into thinking that a series of still images is actually a smoothly moving single image, the technical ways in which the two formats go about doing so are quite different. film relies on a phenomenon known as 'persistence of vision' - pioneered by our old friend Eadweard Muybridge. with film you are literally seeing a series of flashing still images that are each followed by a brief moment of complete darkness. our eyeballs retain the image just long enough so that during that moment of blackness the projector can advance the film one more frame and then quickly flash it before our retention of the previous image fades. if you had bionic eyes or could slow down time you would see that a movie theater's screen is actually dark for half of the entire film projection.
digital projection relies on elements known as scan and field phasing (or something like that) - where each image is literally pushed off the screen by the next. there are never any 'dark' moments on the screen, and in fact if you had bionic eyes you could actually see weird mutant half-frame images throughout the showing. actually, you don't really have to have bionic eyes to see it, you just have to be sensitive to it or know what you are looking for.
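the 'dark half the time' claim above can be sketched with some quick arithmetic. a hedged sketch, assuming a standard 24 fps projector with a two-blade shutter (a detail the comment doesn't specify; three-blade shutters and exact dark fractions vary by projector design):

```python
# Back-of-the-envelope film projection timing.
# Assumptions (not from the comment above): 24 fps sound speed and a
# two-blade shutter, which interrupts the light twice per frame --
# once while the frame sits on screen and once while the film advances.
# "Roughly half dark" is a ballpark, not an exact figure.

fps = 24                              # frames per second
blades = 2                            # light interruptions per frame
flashes_per_second = fps * blades     # 48 flashes/sec, above flicker-fusion

frame_period_ms = 1000 / fps          # ~41.7 ms per frame
dark_fraction = 0.5                   # shutter blocks light ~half the cycle
dark_ms_per_frame = frame_period_ms * dark_fraction

print(f"{flashes_per_second} flashes/sec")
print(f"dark time per frame: {dark_ms_per_frame:.1f} ms")
```

at 48 flashes per second the flicker is fast enough that most viewers fuse it into continuous motion, which is why the dark half goes unnoticed.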
i am not here to say that film projection is better or worse than digital, but the physical interaction each has with your eyeball and the way the images get to your brain is very different. when i see top notch digital projection, my first impression is usually 'wow' but then by the end of the movie i often find my eyeballs feeling strained and i feel like i am getting a headache. and it seems like a lot of people are in that boat.
Ugh, I guess I just wasted my time on a troll, having looked at Graham's other comments (sample: "What's a hipster?"). Hopefully this was useful to someone who honestly wonders what the fuss is about anyway.
First off, Graham, I have probably never been at a digital screening where I couldn't at some point tell the difference. And most, although not all, of my problem concerns all the screenings that are done on something less than the highest-end digital projection. When Cinema 21 showed Metropolis digitally (something they claimed was the only option, although it did not play in NYC until they had a print) I knew instantly. Instantly. I had no reason to expect anything other than a print--otherwise I wouldn't have been there. The lines of introductory text were clearly jagged. Did anyone else know the difference? Apparently not, and if they did they judged it minimal. To me it looked horrible, so there was no point in staying. And that was a decent projection considering a lot of what's done, whether it's at PIFF or the Baghdad or Clinton St or NWFC. I've seen a lot of video projection at those places that compared to fullscreen YouTube video or worse. It just looks WORSE--like I said before, the digital seams show.

I have seen projections of vastly higher quality, and those can have other issues. In those cases, the text onscreen will not be jagged but eerily clean and exact, without the soft, grainy give of film, as though the credits are being superimposed onto the image live by computer rather than integrated as one with the image. However, most of the time, credits aside, it will look like a brand new, perfectly projected print. That's great. Except when it doesn't. Usually the brighter scenes will reveal to me a sort of dull, thin flatness, especially open sky, or bright fire. There is not as much room for bright, slight detail to blend in well with the rest of the image. Digital just doesn't have that capacity. Film can never fail in that regard, because the image is just there. Digital strives to replicate. Film just is. It's not philosophical, and I'm not talking about appreciating art (nor was I before).
Film always has all of its information; digital seeks to make you think it's there. The grain and physicality of film causes the information to merge seamlessly with itself regardless of age. Digital leaves gaps. Seeing True Grit on the highest-end system was basically ruined for me. Two thirds of the movie looked great and I would have never known. One third reminded me I was watching something ersatz. Even without those flaws, I would prefer celluloid. What I was trying to say before is that film can never degrade to the point where its flaws are anything but natural and organic in look. Digital's flaws are not. It's like the difference in snowy static being mixed in with your image in an old broadcast (if you are old enough to know what I am talking about), versus a perfect high definition image that disappears into jagged blue sputters when the reception is poor. You, Graham, like the jagged blue sputters, or more to the point, cannot see them, while we can see them and would prefer to have an image that includes a little snow mixed in with the image instead, if the reception is poor. I do have a romantic notion of film, and love to see battered old prints, but that's not what this is about. If I see a digital presentation that leaves me unaware that I am not seeing film, I can live with that. But the substandard visual experience digital currently provides has not brought us to that point. You are at that point, clearly, and don't feel bad. You are not alone in not being able to distinguish the difference. But why come barging into a funeral, yelling how the smiling android of the deceased is a perfectly nice guy and we are all being silly? You can read all the studies of digital resolution you want, but the proof is there on the screen for some of us, and that's all there is about it.
#2.... Because he makes some kick-ass movies and I find his thinking more relevant than yours. Spielberg prefers film too, among others.
I have read too that digital films differently in low light than film does. I imagine there are many more aesthetic reasons, which are identifiable to those with a keen eye.
@Ovidius: What does film grain accomplish? My smugness is that no one has presented a cogent argument on why film is a better format than digital. I keep asking people to do so and here's the list of reasons why:
1. nothing can replicate the look of light projecting a perfectly struck print onto a screen (no reason given and most likely incorrect)
2. Tarantino likes film (why is this relevant?)
3. an aesthetic one (not explained in any detail)
4. avant-garde films created with a film projector in mind (legitimate, but such a small market as to be largely irrelevant)
5. a well-restored print is a far better experience than a digital projection (no reason given)
6. an argument about having soul and appreciating art (a facile argument)
I am waiting for someone to explain to me why film is better than digital. I fully understand that nostalgia is important for many people and that some movies have an artistic vision that is better rendered by film; but that's not the issue here. The issue is the other 99% of movies. Please explain why film is better.
@Graham: I have no idea what you're talking about. Unless you mean that film grain is some kind of visual flaw that needs to be corrected by technology, in which case I find your disinterest in this subject weirdly disproportionate to your arrogance. If you don't care about the differences between film and digital, then I don't know what accounts for your smugness and defensiveness, unless of course you just like to feel superior. I don't suppose you'd given this much thought before the opportunity to play provocateur presented itself.
@I,A & Ovidius: So the argument in favor of film over digital is, "it looks worse, but it's always looked worse; so we shouldn't improve things"? This IS all boiling down to the same audiophile argument over why record players sound better than MP3s.
Guess what? No one but an incredibly small minority cares. Unless you can make convincing arguments as to why your whale-bone corsets are still useful, you're going to be relegated to the dump-heap of history.
All this nonsense about lines of resolution is beside the point--if you are incapable of telling the difference between celluloid and the digital, you have nothing to worry about. If you think that a well-worn piece of 40-year-old celluloid falling below the crisp ideal of Blu-ray somehow proves its inferiority, then you are in luck! The rest of us, those with souls and the ability to watch something flawed but still beautiful over something perfect and sterile (when not showing its digital seams), are screwed. DA Pennebaker recently was forced to spend 15,000 bucks cleaning up a print to show for one weekend somewhere, because digital has led people to become incapable of viewing a print that has flaws. That's ridiculous.

I'm the guy who has tried occasionally over the years to warn about this, and to get "film lovers" like Ranieri and Sonstein to prove it and TELL us when they are showing digital, including via that I, Anonymous piece. If you have noticed, they DON'T do that (adding to the culture of complacency with digital that hastened film's demise). What both do instead is mention only when a film is 35mm, as if they are delivering a special treat to you. This backwards-thinking advertising evasiveness is quickly becoming the approach that makes sense, since in a year or two we can safely assume that what we are seeing is digital unless told otherwise (except at NWFC, perhaps, which still refuses to tell us what we will be watching until you get to their front door--thanks bunches!). I have felt like a voice in the wilderness the last few years, but now that the end really is here, I am honestly shocked at how quickly and thoroughly it is coming, like a man standing with his "The End Is Near" sign, going, "Really? It's TOMORROW?"
I at least appreciate that Erik did this piece, esp since he has come down so squarely on the side of digital projection in the past. At least he can appreciate the ramifications on others.
A good Blu-ray will preserve the look of the grain, but the big studios sometimes go overboard with DNR (digital noise reduction), creating weird, waxy-looking transfers. I read reviews before I buy any Blu-rays from Fox, Warner, etc.
@Graham: And if you don't care for film granularity, I'm sure those Blu-rays are stunning.
@geyser: The 4K-6K numbers I've seen used are all in regard to the theoretical maximum of what 35mm film is capable of. As with all analog copies, there is considerable degradation of the content with each generation. The studies I linked to seemed to correct for a lot of the projectionist issues. Also, the 100th showing of the digital copy will look exactly like the 1st showing; something that film is obviously going to fail at.
If the studies I linked to earlier are even roughly accurate, they would seem to indicate that a high-quality Blu-ray is roughly equal in visual quality to a mediocre-to-low-quality film print.
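For a sense of scale on the numbers being thrown around in this thread, here's a rough pixel-count comparison. A hedged sketch: the 35mm rows use the 4K-6K horizontal-equivalent estimates quoted above, which are assumptions about theoretical maximums, not measurements, and the aspect ratios assigned to them are illustrative guesses:

```python
# Rough megapixel comparison for the formats discussed in this thread.
# The 35mm entries are the thread's own quoted 4K-6K estimates for a
# pristine negative; duplication generations and projection losses
# would pull real-world prints well below these figures.

formats = {
    "Blu-ray (1080p)":      (1920, 1080),
    "2K DCI projection":    (2048, 1080),
    "4K DCI projection":    (4096, 2160),
    "35mm (low estimate)":  (4000, 3000),   # assumed ~4K-equivalent scan
    "35mm (high estimate)": (6000, 4500),   # assumed ~6K-equivalent scan
}

for name, (w, h) in formats.items():
    megapixels = w * h / 1e6
    print(f"{name:22s} {megapixels:5.1f} MP")
```

Even on the low 35mm estimate, a print's theoretical detail sits above a 4K projector and well above Blu-ray, which is why the argument hinges so heavily on how much of that detail survives duplication and projection.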
Most studies I've seen put the estimated "resolution" of 35mm far higher than that. Film is analog so there aren't lines of resolution, of course. Thus it's going to be an inexact thing, and I see some oversimplifications in the studies you link. For example the second one says they used film stocks and lenses "typical of those used for feature films" but I think there is considerable variation among lenses and film stocks.
From what I've read, I believe the consensus is that 35mm still far exceeds 4K, but I don't have the time or inclination to dig up a bunch of links to try to back this up, sorry.
My obviously biased opinion.
All contents © Index Newspapers, LLC