Screenshots and Resolutions

In the earlier days of the Vs. Debate, the posting of screenshots was virtually non-existent, unless you were just posting a generic shot of something you scanned out of a magazine.  For starters, there wasn't the ease of having things on DVD . . . ST:TNG was the first bit of Trek or Wars on DVD, and that was 2002.   That meant there was no simple act of popping a DVD into the computer and making screenshots to your heart's content, and of course in the early days there was no free image hosting service on the web like you find today.  

In order to post a screenshot, you had to have the following:

1.  A video cassette recorder (VCR) . . . this would enable you to record the program from the television broadcast and thus have it to work with.   You could hardly hope to capture a particular image when the show was being aired, after all.   Further, it had to be one of the ones that would allow for good frame-by-frame, which was no small feat for cheaper VCRs . . . you'd frequently end up with jumpy, warped, and/or staticky images.  Of course, even at the best recording rate you were losing information, but we're getting a bit ahead of ourselves.

2.  You'd also need a piece of equipment that would hook your computer to your VCR and allow you to digitize the analog magnetically-recorded images off of a VHS tape.   These got cheaper around the time of the advent of DVD, but were still the sort of thing you wouldn't go out and buy just to win a debate with a bunch of geeks.   And, of course, unless you wanted to move your VCR back and forth and engage in the various disconnections and reconnections, it would help to have another VCR right by the computer, adding to the pricetag.

3.  Assuming you had a suitable VCR with maximum-quality recorded tapes of your local station's broadcast of Trek or Wars whenever they might've been shown, and the digitizing device hooked up to your machine, then you now had the maximum quality screenshot available.   Of course, unless you e-mailed it to everyone then you were screwed, so you also had to have some webspace available to you.   Students and some company employees might've gotten a few megabytes for free, but if you really wanted to make a festival of screenshots you'd probably have to pay for the privilege.  Webhosting is now cheap enough that almost any schmoe can afford a site, but back in the day this wasn't necessarily the case.   So, if you wanted to be hardcore with your Vs. Debate activities, you'd shell out some bling (before it was even known as such) and get several megabytes of hosting for quite a few dollars per month.  

All this is simply to point out that it was a time-consuming pain in the butt to post screenshots, and in order to get to that point you had to shell out some time and dough.   While we're not talking massive time and dough, it was enough that you really, really had to be concerned about the Vs. Debate to even consider it.

(If you were lucky, of course, you could bypass all this by having a magazine that showed a picture of whatever you wanted a scan of.   This is why older sites and older arguments often focused on scans of magazines, trading cards, and other sources of images that would seem somewhat non-obvious to us today.)

Of course, even if you went all-out you still had a limited resource.  Below, we'll also ponder what the best possible resources are.

Image Resolution and Interlacing

Televisions operate via scan lines . . . in modern computer lingo, the number of scan lines would translate as the vertical resolution.   For instance, how about I post an image that's 318 pixels tall:

Each one of those -horizontal- lines of pixels . . . a horizontal line of dots like those of ellipses . . . would, in simplified TV lingo, be a scan line. 

Of course, in order to make full-motion video you had to transmit several pictures (frames) per second.   The human brain, happily, is generally capable of filling in enough of the gaps so that you don't have to show gajillions of frames per second . . . film, for instance, uses a mere 24 frames per second.   

But film actually shows you whole pictures for each frame, which is something television doesn't do.   The fact that TV doesn't do it was actually a technical necessity, really.   The early televisions could easily have drawn 24 frames in a second, but this involved drawing every horizontal line, top to bottom, for each frame.  Thanks to the rapid decay of the brightness of the scan lines, the top of the picture would already be fading by the time the bottom was drawn, and you'd end up with a most annoying flicker.  Back in the day, some bright spark got the idea that instead of trying to make the television draw all the horizontal lines at once, they could achieve a similar effect by flitting back and forth between half the lines.   In other words, the TV wouldn't show a whole picture like the one above, but would show a picture where every second scan line was dark . . . something like this:

Of course, this only took half the time for the picture tube to draw, since it was only every other line.   And thus the TV could then come back and draw the other half of the lines, doing so quickly enough that our brains wouldn't notice a thing.   This interlacing of the scan lines wouldn't look as good as the progressive display of whole frames, but it was a workable solution for the limited tech of the day.
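The even/odd split described above can be sketched in a few lines of Python. The "frame" here is just a list of labeled scan lines standing in for a real picture; the point is purely to show which lines land in which field and how weaving reassembles them:

```python
# Sketch of interlacing: split a tiny 6-line "frame" into two fields.
# A real TV signal is analog; this only illustrates the line ordering.

frame = [f"line {i}" for i in range(6)]

# Field 1 carries the even-numbered scan lines, field 2 the odd ones.
field1 = frame[0::2]   # lines 0, 2, 4
field2 = frame[1::2]   # lines 1, 3, 5

# Weaving the two fields back together reconstructs the full frame --
# perfectly for a static image, with "combing" artifacts if anything
# moved between the moments the two fields were captured.
woven = [None] * len(frame)
woven[0::2] = field1
woven[1::2] = field2

print(field1)           # ['line 0', 'line 2', 'line 4']
print(woven == frame)   # True
```

For a static picture the weave is lossless; the combing artifacts discussed later come entirely from motion between the two field captures.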

Interlaced vs. Progressive Scan

As noted, film shows whole pictures.  It does so via a moving 'tape' of film frames, each of which is briefly held at the projector's aperture while a very bright light flashes through it.  Earlier, I also mentioned that the reason why interlacing became the standard is because of a technological issue.   Well, that issue isn't really an issue anymore.

While any sort of television still involves scan lines, it is now entirely possible to place a frame on the screen every required fraction of a second without having to worry about the flicker that an old television would've suffered.  In other words, instead of displaying every second scan line to make a field and then going back and doing it again to make another field, most modern television sets would be more than capable of drawing a whole frame at once, and then going back to do it again, without any sort of flickering.   This drawing of complete, non-interlaced frames is called progressive scan.

As a result of interlacing, the number of frames per second would 'really' be doubled, but each frame would only be half of a picture.   These wonky half-frames came to be known as fields.   Technically, then, interlaced video doesn't have frames . . . it has fields, and if you are working with a frame you've captured you're really just working with two fields.  After all, since television cameras were designed with interlacing in mind, the cameras scanned in an interlaced format, as well.  This is a concept entirely foreign to many people . . . we think in frames per second, and would assume that a video camera was simply snapping X frames in one second.   In reality, this wasn't so.

Thus, in a moving scene each field was separated from the next by a fraction of a second, meaning that a combined image could show some peculiar effects on fast-moving objects.   In the image to the right, for instance, we have the Q fireball warping toward the camera in "Encounter at Farpoint"[TNG1].   The ghostly image of a second, larger fireball is more than just motion blur . . . it's an artifact of the interlacing.   I commonly avoid the use of such frames where possible purely for aesthetic reasons, but fields can still be counted and used for timing purposes, just as one would count frames.


So how many fields per second are we talking about here?  Well, that depends.

North American television settled on the NTSC standard.  Image transmission would occur at 30 frames per second (or, really, 60 fields per second).   (Unfortunately there was a minor glitch found with this in the early color TV era, and thus the framerate was bumped down a smidgen to 29.97fps.)   Transmissions featured 525 actual scan lines, though with some of these budgeted for non-visual detail only 480 were theoretically visible.  Given the 4:3 aspect ratio, the maximum resolution was 640 x 480, or about 0.3 megapixels.   In practice, though, broadcast TV only featured about 330 pixels per horizontal scanline.   That's 330 x 480, or almost .16 megapixels.

European standards settled toward PAL.   They ended up with a system of slightly higher resolution but with a smaller number of frames per second . . . of 625 horizontal scan lines per frame, 576 were theoretically visible . . . almost 100 more than with NTSC.  However, the framerate was only 25 frames per second.  This was due in part to electrical system differences . . . in North America alternating current at 60Hz was used, whereas European current only alternated at 50Hz.   Due to various technical reasons it was best to have the AC and framerate more or less synced.   This lower refresh rate meant that flicker might be more noticeable, but I'd say the higher resolution was probably worth it.
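The NTSC arithmetic above, plus the PAL comparison, can be checked in a couple of lines; every input figure comes straight from the text:

```python
# NTSC figures from the text.
ntsc_visible_lines = 480
ntsc_full_width = 640        # 4:3 aspect at 480 lines, square pixels
ntsc_broadcast_width = 330   # typical broadcast horizontal detail

# PAL figure from the text.
pal_visible_lines = 576

max_pixels = ntsc_full_width * ntsc_visible_lines
broadcast_pixels = ntsc_broadcast_width * ntsc_visible_lines

print(max_pixels / 1e6)        # 0.3072 -> "about 0.3 megapixels"
print(broadcast_pixels / 1e6)  # 0.1584 -> "almost .16 megapixels"
print(pal_visible_lines - ntsc_visible_lines)  # 96 -> "almost 100 more"
```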

(These, incidentally, are the reasons that video shot with European cameras looks weird after being converted for American televisions.  The reverse is probably also true, though I haven't been to Europe to check.) 

VHS, LaserDisc, and DVD

I mentioned earlier that even if you went all-out with your capture equipment, you still had a limited resource.  By this, I referred to the limitations of VHS.  Even at the maximum resolution obtained by selecting SP mode, a VHS tape was only giving you 220 to 240 pixels per horizontal scanline, or about a third less than what you got from broadcast TV.   The brief life of the Laserdisc format saw an improvement to over 400 pixels per horizontal scan line . . . better than broadcast television.  Indeed, Laserdisc versions of the Star Wars films are still the best versions of the pre-SE films around.
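For comparison's sake, here's a quick tabulation of the per-scanline figures just discussed (the Laserdisc entry uses the floor of the "over 400" claim, and VHS is taken at its SP-mode best):

```python
# Rough horizontal-detail comparison of the analog formats above.
horizontal_detail = {
    "VHS (SP)":     240,
    "Broadcast TV": 330,
    "Laserdisc":    400,   # "over 400" -- floor of the claim
}

# How much of broadcast's horizontal detail does VHS throw away?
vhs_loss = 1 - horizontal_detail["VHS (SP)"] / horizontal_detail["Broadcast TV"]
print(round(vhs_loss, 2))  # 0.27 -> roughly "a third less" than broadcast
```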

However, it was the DVD format that brought real quality to the equation.   Introduced in 1997 in the U.S. and with the players more widely affordable by 1999, DVDs store compressed video at a resolution of up to 720 pixels across each of 480 scan lines, or 720 x 480.  Though true videophiles grimace at the MPEG-2 compression of the video, the fact remains that it's a far higher-resolution format than we've had before.

Of course, that resolution wouldn't have worked in PAL countries . . . their DVDs are 576 pixels deep, though still only 720 wide.  How this is achieved varies . . . if it's something that was shot on a simple video camera in an NTSC country, then the video was shot at 480 lines and is artificially up-converted.   There's no actual enhancement of the resolution since the original source images were only at 480 lines.   Further, fields get discarded in order to bring the rate from 60 fields per second down to 50. 

If, however, the original source material is a film, then this isn't a bad thing at all.  In PAL countries, films are shown at the PAL framerate of 25fps with no major changes, meaning that they end up playing four percent faster than the normal film speed of 24fps.   Also, it's usually true that films meant for PAL distribution will be directly converted to the PAL standard from the original film, meaning that PAL countries get a higher-resolution version of the film than NTSC countries.
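The four-percent PAL speed-up is easy to verify: a film shot at 24fps is simply played back at 25fps, so every second of film takes 24/25 of a second to show:

```python
film_fps, pal_fps = 24, 25

speedup = pal_fps / film_fps - 1
print(round(speedup * 100, 2))  # 4.17 -> the "four percent faster"

# Effect on a two-hour film:
runtime_min = 120
pal_runtime_min = runtime_min * film_fps / pal_fps
print(pal_runtime_min)  # 115.2 -- nearly five minutes shorter
```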

Another issue is with the interlacing.   In general, DVD video content is designed for interlaced displays.   However, there are progressive scan DVD players which are capable of taking interlaced content and converting it to progressive mode.   For television shows originally shot in interlaced video this might not be of much value, but for a film that's been converted to interlaced format this reversing of the conversion can serve to digitally restore the original frames, for the most part, giving you the highest quality images for each and every frame. 

Last but not least, there's the issue of anamorphic widescreen DVDs.   These are optimized for the next-generation televisions which will have a 16x9 aspect ratio instead of the current 4x3.    What this means for us is that an anamorphic widescreen DVD  doesn't waste as much possible resolution as might otherwise be wasted due to the black bars of a widescreen DVD, but there's still going to be some loss.  For more on the matter look here.
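As a rough sketch of why anamorphic matters, here's the line-count arithmetic for a 16:9 picture letterboxed into a 4:3, 480-line frame (an idealization assuming square display pixels; an anamorphic transfer instead squeezes the picture to fill all 480 lines and lets the player or TV stretch it back out):

```python
dvd_lines = 480

# A 16:9 picture inside a 4:3 frame only occupies (4/3)/(16/9) = 3/4
# of the frame's height; the rest goes to black bars.
letterboxed_lines = round(dvd_lines * (4 / 3) / (16 / 9))

print(letterboxed_lines)              # 360 lines of actual picture
print(dvd_lines - letterboxed_lines)  # 120 lines lost to the black bars
```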

HDTV and HD-DVD / Blu-Ray

Whereas previous optical disc formats (like Laserdisc and DVD) easily outstripped the quality of broadcast television, a new standard in television has put it back at the top of the heap, for now.   High-Definition Television (HDTV) is that standard.   Capable of up to 1080 lines of resolution, HDTV is a vast improvement over the older NTSC sets that most people use.  Further, HDTV can take advantage of the better technology and show progressive frames instead of interlaced fields.

Unfortunately there's some confusion with HDTV right now, since the people who wrote the standards have multiple standards for it.   One is the 720p standard . . . this means that only 720 scan lines are employed, but these are shown progressively.   The alternative is 1080i, which features 1080 scan lines but in an interlaced format.  Contrary to what you might think, 1080i is not really a huge improvement over 720p, thanks to the interlacing.   And as if that weren't enough, there's also what's being called "HD Lite", primarily used in broadcasting, which features the ad-worthy 1080 interlaced lines, but instead of the 1920 horizontal pixels you might expect you only get 1440 or 1280, which are then stretched out on your screen (much as occurred with some old or low-quality 480i television broadcasting).
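To put some numbers on the 720p-versus-1080i comparison, here's the raw pixel-throughput arithmetic (ignoring overscan, and treating each 1080i field as half a frame's worth of lines):

```python
fields_or_frames_per_sec = 60

# 1080i delivers half its lines per field, 60 fields per second.
pixels_1080i = 1920 * 1080 // 2 * fields_or_frames_per_sec
# 720p delivers whole frames, 60 per second.
pixels_720p = 1280 * 720 * fields_or_frames_per_sec

print(pixels_1080i)  # 62208000 pixels/sec
print(pixels_720p)   # 55296000 pixels/sec -- only ~12% less, despite
                     # 1080i's 50% more advertised scan lines

# "HD Lite": the ad-worthy 1080 lines, but fewer horizontal pixels.
for width in (1920, 1440, 1280):
    print(width, width * 1080)
```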

Happily a 1080p standard is gaining strength, and hopefully it will eventually take over.

For now, however, there is not yet an optical disc format capable of storing HDTV content.   This means that even if you go hit the nearest electronics store and buy an HDTV capable of 1080p, you're still only getting 480p (best case) out of your DVD.  Various artificial means can be used to up-convert the image, but these don't necessarily represent what's really there.

This will change within the next few years.  Much as was seen with the VHS vs. Beta war in the 1980s, there are two competing standards for the next generation optical disc, each capable of storing HDTV content.   Called HD-DVD and Blu-Ray, only time will tell which comes out on top.  (Sony's still pissed from losing the Beta war, though, so my money's on them.)   20th Century Fox (Lucasfilm's distributors) and Paramount are both supporting the Blu-Ray standard.   Since Blu-Ray is the only one with native support for 1080p, the extreme videophiles might want to root for either Blu-Ray or at least a long-term struggle for supremacy.

Unfortunately, the PAL vs. NTSC issue will still have some effect even with HDTV.   You won't just have 1080p . . . former NTSC countries will have 1080p60, whereas PAL countries will show 1080p50.   The extra two digits on the end, as you might've guessed, refer to the refresh rate.

(Incidentally, there's also "EDTV" on the market right now . . . this is simply a digital television capable of showing 480p instead of TV's normal 480i.  It helps with DVDs, but it's not a vast improvement like HDTV.)

But What About Film?

Film has no particular resolution, really, any more than a painting does.  It all depends on how much information can be eked out of a particular image.   The act of transferring film images to an interlaced medium (for broadcasting to television or (frequently) placing the film on a DVD) does involve resolution.   This is called telecining, and is done via a film scanning device.  These scanning devices can also be used to simply scan the progressive frames for later digital manipulation.   In either case, though, we're talking about recording a picture at a certain resolution.

So what's the maximum resolution we can grab off of the most common 35mm film?   That's a subject of some debate.  It's claimed by some that up to 10,000 useful horizontal lines can be scanned in from 35mm film . . . this would be almost ten times the line resolution of HDTV.   Meanwhile, others claim that the new high-definition digital motion picture cameras with a resolution of 1080p24 actually beat 35mm film . . . this would basically mean that 1080p60 HDTV is as good as film, and faster, too. 

Personally, I find the notion that HDTV on a 30 foot screen would be better than 35mm film to be unlikely, unless the 35mm film stock is cheap or old.  This is especially true given that some of the new digital projectors for theaters are already at 1536p.  But, perhaps the safest thing to say is that film and HDTV are within the neighborhood of one another resolution-wise, depending on various factors.   

Best of Show:  Putting it All Together

Nowadays, just about any computer will come with a DVD player, and most DVD-playing programs on the computer will give you the option of capturing an image from the DVD.   And, of course, one can get webspace or picture-storing space for little to nothing and with little effort.   As a result, most everybody can post screenshots all day long.

But are they giving the very best?   Well, that depends.  It's possible, of course, to up-convert just about anything, but in my opinion up-conversion is an artificial improvement when it comes to canon analysis . . . it's like adding words to a too-short book that weren't there before.  On the other hand, an image that's simply been blown up (say, a 10 pixel by 10 pixel part of the screen that someone has simply resized to 100px by 100px) isn't showing what wasn't there . . . it's showing what is there and letting you actually see it.  So if something was filmed with simple 480i video, an up-conversion to HDTV might look better on an HDTV set, but if you're counting pixels you're not looking at the image in its native resolution . . . you're looking at things that were never actually there.
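The blow-up-versus-up-conversion distinction can be shown in miniature. The pixel values below are arbitrary stand-ins for two adjacent source pixels:

```python
row = [10, 200]  # two adjacent pixels from some tiny source image

# "Blown up" 2x (nearest-neighbour): every value in the result
# appears in the original -- nothing new is invented.
blown_up = [p for p in row for _ in range(2)]
print(blown_up)  # [10, 10, 200, 200]

# Linearly "up-converted" 2x: the in-between 105 is brand-new
# information that was never actually there.
upconverted = [row[0], (row[0] + row[1]) // 2, row[1], row[1]]
print(upconverted)  # [10, 105, 200, 200]
```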

Let's start with Star Trek: The Original Series.   It was shot on film with the special effects work done via film opticals, usually aiming for television quality (and generally hitting it).  Since the show was shot in the US and is thus NTSC, the best we can hope for in order to get the maximum resolution is 480i.  Any old DVD player and analog television will give you that.   PAL viewers are watching up-converted versions, meaning that they're gaining almost 100 artificially-interpolated scan lines but are losing ten fields (the equivalent of five full frames) per second.   There's little point to buying the show on the next generation optical disc format.

   Format Info:

Native 35mm Film (finished for 480i)
Best Consumer Recording (World): 480i DVD
Best Consumer Recording (U.S.): 480i DVD
Best Consumer Recording (Next-Gen): (none)

Now let's look at the original Star Wars trilogy.  Shot and with special effects done entirely on film, the native resolution is going to be quite good.  Even the new CGI effects were good enough for film viewing, so there's no problem there.  For the purposes of DVD, then, the best resolution is going to come from a PAL source, though of course this will play four percent faster than the original film.  Assuming separate transfers were done for the widescreen and the fullscreen, then there will be certain portions of the films (i.e. those visible in the fullscreen pan & scan) which will be visible at a full 576 lines.  Otherwise, you'll be losing some resolution thanks to the necessary black bars of widescreen.  (My widescreen trilogy set is in anamorphic widescreen, however, meaning that the loss is minimized provided that you can keep your player from down-converting the image).   So, for the best version of the films on DVD you'll want both PAL versions (wide and full) played deinterlaced via a progressive scan DVD player, thereby achieving 576p.  North American viewers will be limited to 480p.  The next generation high-density optical discs will be a profound improvement, giving all of us 1080p.

Also noteworthy here are the recent broadcasts of the Star Wars films in HD.   This has occurred on Sky HD in the UK, Cinemax HD in the United States, and elsewhere.  While the precise broadcasts have varied (primarily in regards to data bitrate differences and codec variations . . . h.264 in the UK versus MPEG-2 compression in the US), all are apparently derived from the original HD-quality transfers of the films that were used to master the DVDs.   Thus, if one has access to these then one has access to the most detailed views of the Star Wars films.  But naturally, there's a catch . . . since the films were broadcast in 1080i, then one is stuck with interlacing. 

   SW OT (Original Versions)
   Format Info:

Native 35mm Film
Best Consumer Recording (World): 576i PAL Laserdisc
Best Consumer Recording (U.S.): 480i NTSC Laserdisc
Best Consumer Recording (Next-Gen): (none)

   SW OT (DVD Versions)
   Format Info:

Native 35mm Film
Best Consumer Recording (World): 576i PAL DVD (576p with Prog. Scan)
 1080i HD Broadcast (Sky HD et al.)
Best Consumer Recording (U.S.): 480i NTSC DVD (480p w/ Prog. Scan)
 1080i HD Broadcast (Cinemax HD 2007)
Best Consumer Recording (Next-Gen): 1080p Blu-Ray

The same should be true of the Star Trek films, assuming no shortcuts were taken for the DVDs.   For instance, the new version of ST:TMP had some new CGI effects done for it, but I'm not sure what the resolution of those effects might have been.   It's possible that they did simple video-quality (480i) effects and then up-converted them for the PAL DVDs, for instance, though as I recall there was a limited theatrical release, which would hopefully mean that sort of shortcut couldn't have been performed.   Assuming no such shortcuts occurred, then for the best quality you'd want both PAL versions (if applicable) played deinterlaced via a progressive scan DVD player, thereby achieving 576p.  North American viewers will be limited to 480p.  The next generation high-density optical discs will be a profound improvement, giving all of us 1080p.

   ST Films
   Format Info:

Native 35mm Film
Best Consumer Recording (World): 576i PAL DVD (576p Prog.)
Best Consumer Recording (U.S.): 480i NTSC DVD (480p Prog.)
Best Consumer Recording (Next-Gen): 1080p Blu-Ray

With TNG, DS9, and Voyager, things are a little confusing.   These were all shot on 35mm film and then transferred to an interlaced analog signal (480i) (a process called telecining) for broadcast.   Further, the special effects (at least during TNG) were largely done on simple 480i video, though the model shots and whatnot were also shot on film.   These would be telecined to 480i video and composited, with certain effects like phaser beams and torpedoes done entirely via video.  Certain other effects like computer-generated viewscreen images by Okuda and company would be done in quite high resolution before transfer to video.  So, in theory, you could get some reconstruction of this film-quality imagery via a 480p player, but certain FX elements were natively 480i.   I'd say the safest course to ensure the closest-to-native quality would be simple 480i, just as with the Original Series.   PAL viewers are watching up-converted (non-native) versions, meaning that they're gaining almost 100 artificially-interpolated scan lines but are losing ten fields (the equivalent of five full frames) per second.  There's little point to buying the show on the next-generation optical discs.

   Format Info:

Native 480i / 35mm film down-conversion
Best Consumer Recording (World): 480i NTSC DVD
Best Consumer Recording (U.S.): 480i NTSC DVD
Best Consumer Recording (Next-Gen): (none)

For the Star Wars prequel trilogy, we actually need to split up the films individually since their native resolutions are so different, though the end result is the same as for any other films.  In all cases, your best bet is to treat this like the original trilogy DVDs . . . go for PAL and hunger for the next-generation discs at 1080p.

In the case of Episode I, we have a movie shot on film and converted to a digital format for all the zillions of effects, at which point it was put back on film.   The resolution of this digital conversion and effects work is not known to me, but presumably it was at HDTV-res or higher.  In the case of Episode II, we have a movie shot entirely on an early digital camera with a native resolution of 1440 x 1080p24 compressed progressive-scan video.  Episode III is only a little different.   Filmed with a more advanced digital camera, Episode III was recorded as uncompressed digital progressive video at 1920 x 1080p24.   Reportedly, however, filming in the full 2.35:1 aspect ratio (the really wide widescreen Episode III was filmed in) results in a lessened resolution, since the camera maxes out at 1920 by 1080 (an aspect ratio of 1.78:1).   This would mean that Episode III (and perhaps Episode II) only featured 817 lines of resolution natively, meaning anything above 817p is going to be more than was there to begin with.  Hopefully this information is wrong and some sort of anamorphic lens or post-production processing was employed to give more than this figure by design.  Further information on this is desired.
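The 817-line figure can be derived directly: a 2.35:1 picture that must fit within the camera's 1920-pixel-wide sensor can only use so many of the 1080 available lines.

```python
sensor_width, sensor_lines = 1920, 1080
scope_aspect = 2.35

used_lines = round(sensor_width / scope_aspect)
print(used_lines)  # 817 -- anything above 817p exceeds the native detail
```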

As noted above for the original trilogy, the recent HD broadcasts of the Star Wars films (Sky HD in the UK, Cinemax HD in the US, and elsewhere) cover the prequels as well, derived from the same HD-quality transfers used to master the DVDs . . . and with the same catch, since the 1080i broadcasts leave one stuck with interlacing.

   SW Prequels
   Format Info:

Native 35mm Film / 1080p24 (poss. 817p!)
Best Consumer Recording (World): 576i PAL DVD (576p Prog.)
 1080i HD Broadcast (Sky HD et al.)
Best Consumer Recording (U.S.): 480i NTSC DVD (480p Prog.)
 1080i HD Broadcast (Cinemax HD)
Best Consumer Recording (Next-Gen): 1080p Blu-Ray
(or perhaps 720p?)

Finally, we come to Star Trek: Enterprise.  All four seasons were broadcast in 16:9 widescreen, and the DVDs are in anamorphic widescreen.  This means that whereas an older 4:3 NTSC television will have black bars blanking out many of the 480 lines, one of the newer 16:9 televisions will get the picture across the full 480 lines . . . the DVD stores this as 720x480i with non-square pixels, which get stretched out to roughly 852x480 on display.  Thus, for full quality an NTSC-type interlaced DVD with anamorphic widescreen is the way to go.   PAL up-conversion and high-density optical discs won't help.
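That roughly-852-pixel display width comes from simple aspect-ratio arithmetic (the rounding is why you'll see 852, 853, and 854 all quoted for this figure):

```python
lines = 480

# A 16:9 shape at 480 lines is 480 * 16/9 pixels wide with square pixels.
display_width = round(lines * 16 / 9)
print(display_width)  # 853 -- commonly quoted as 852 or 854
```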

For the third and fourth season of Enterprise, though, the broadcast wasn't in 480i but in 1080i HDTV.  This means that the DVD sets are actually featuring far less resolution than the original broadcast.   Many fans are thus also downloading high-quality versions online ripped from the original 1080i broadcasts, ranging from extremely clean ~480i down-conversions to some episodes in 720p format.  I'm not sure if the PAL versions of the last couple of seasons of Enterprise are properly-done transfers of the originals straight to PAL or if they are upconverted from 480i.  In either case, though, until the emergence of high-density optical discs, these few 720p HDTV-rips will be the best available versions.

   Format Info:

Native:
 S1, S2:  480i 16:9 (35mm film down-conv.)
 S3, S4:  1080i60 16:9
Best Consumer Recording (World):
 S1, S2:  480i NTSC ana. DVD
 S3, S4:  576i PAL ana. DVD
Best Consumer Recording (U.S.):
 S1, S2:  480i NTSC ana. DVD
 S3, S4:  480i NTSC ana. DVD
Best Consumer Recording (Next-Gen):
 S1, S2:  (none)
 S3, S4:  1080i NTSC (HD-DVD / Blu-Ray)

Formats and Canon

In the case of Star Wars, the HD broadcasts . . . a tease of what we can expect in the Blu-Ray era . . . offer the best picture quality, though for frame-by-frame timing the DVDs will be superior.  While it should be possible to deinterlace the 1080i broadcasts, the likely bitrates used for broadcast will still be less than what one could expect on a Blu-Ray disc.  But still, as these are genuine transfers of the Star Wars films and not mere upconversions, these offer us the best presently-available views of the movies.

At some point, it might be worthwhile to ponder whether, say, an up-converted version of TNG that's been processed to look great at 1080i is in fact the canon version of the show so long as it's released by Paramount.   After all, there would be all-new pixels for us to zoom in on.  Technically the answer is probably yes, though from the perspective of whether we were looking at information which was actually present to begin with the answer is probably no.

Of course, if this sort of question becomes a major issue once the next generation optical discs come out then we're probably looking at this stuff way the hell too closely for our own good . . .

 . . . not that it would be the first time.  ;)

More information can be found all over the internet.  I recommend this site for further reading.

Special thanks to pro-SW debater "Mange the Swede" for complaining about some of my "low quality" DVD-resolution images, thereby making the need for this page readily apparent.