Voyager 40th Anniversary: Watching an Alien World Turn

Ian Regan • August 28, 2017

In 1979, two robotic emissaries, conceived of and built by humans, trained their electronic eyes upon a giant alien planet and its coterie of moons. The images radioed back to Earth by these now iconic spacecraft have been printed and reprinted throughout the 38 years since. But due to limited processing power, the quality of the original data was never fully conveyed in the color composites that flashed up in newscasts, adorned glossy pages in magazines, and hung as posters in the bedrooms of space-obsessed youngsters with starry-eyed ambitions. Few people (except, maybe, regular readers of this website) know that all the original data are available to the public, just waiting to be reworked with modern image processing techniques.

In this context, I set myself the goal of restoring the time-lapse movies that both Voyager space probes captured as they edged closer to their rendezvous with Jupiter in 1979. In these breathtaking sequences, the giant world rotates before us. Cloud bands and the famous Great Red Spot appear from the gloom, only to swiftly disappear over the horizon. The satellites first spotted by Marius and Galileo whip around in a Newtonian dance, casting fuzzy shadows upon the cloud decks. The Voyager 1 movie may be familiar to most watchers of science documentaries, but it has never before been presented in a high-definition, high-fidelity format that does justice to the source data.

Between 18:31 hours (UTC) on January 30 and 19:27 hours (UTC) on February 3, 1979, Voyager 1 took 3,636 narrow-angle camera (NAC) frames, shuttered through a repeating sequence of orange, green, and blue filters. Due to intermittent problems with the downlink between the spacecraft and the dishes of the Deep Space Network (DSN), more than a hundred of those images were lost forever, creating gaps that appear as noticeable jumps in the restored time-lapse sequence. The three innermost Galilean moons all make cameo appearances, most notably a majestic transit of Ganymede, followed by its shadow a short time later. The low phase angle and equatorial vantage point provided ideal conditions for this historic observation. The movement seen in this video is a smoothed and averaged-out representation of the attitude of Voyager's scan platform as it took this sequence.

NASA / JPL-Caltech / Ian Regan

The Voyager 2 movie is shorter and has been seen less often.

Between 20:34 hours (UTC) on May 27 and 22:34 hours (UTC) on May 29, Voyager 2 took 1,251 NAC frames, shuttered through a repeating sequence of orange, green, and violet filters. Only three didn't make it to Earth, but about 40 suffered significant data loss. This sequence covered five Jovian days. Unlike its sister craft, Voyager 2’s inbound trajectory was noticeably inclined to the Jovian equator; indeed, the sub-spacecraft latitude was approximately 8 degrees north. The phase angle, at nearly 38 degrees, is higher than Voyager 1’s perspective, so Jupiter appears in a gibbous phase. The movement of Jupiter in this video is a smoothed and averaged-out representation of the attitude of Voyager's scan platform as it took this sequence.

NASA / JPL-Caltech / Ian Regan

Finally, it's interesting to look at a bit of the two videos side by side, to see how atmospheric features shifted from one encounter to the next.

A side-by-side comparison of two Jovian revolutions filmed by the twin Voyager spacecraft, created in honor of the 40th anniversary of the Voyager launches. Note how the storms and other features changed positions in the four months between encounters.

NASA / JPL-Caltech / Ian Regan

Following is an explanation of the source data and how I produced these new movies.

The Observatory Phase

Two months prior to their respective closest approaches, both Voyagers entered the so-called ‘Observatory Phase’, during which the suite of remote-sensing instruments onboard each craft engaged in near-constant monitoring of the Jovian system. Although the resolution of these observations would be much lower than that of the images captured near encounter, they would allow scientists to watch how atmospheric features evolved over time.

A variety of narrow-angle camera image sequences was scheduled for the two-month observatory phase, including several time-lapse movies. Early movies captured a set of images in 4 filters every two hours, thereby documenting the change of atmospheric features at regular intervals of 72 degrees of longitude. The most famous product of this observation, a clip made of selected images at the longitude of the Great Red Spot, was restored only a few years ago by long-time Planetary Society contributor Björn Jónsson:

VIDEO

This movie is based on 58 orange-green-blue color composites obtained on every Jovian rotation from January 6 to January 29, 1979. Over this period Voyager 1's distance from Jupiter dropped from 58 to 36 million km, so the resolution and sharpness of the frames increase from start to finish. The 58 frames were tweened, increasing the number of frames by a factor of 8 (that is, 7 synthetic frames are inserted between each pair of real frames).

NASA / JPL-Caltech / Processed by Björn Jónsson
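
The tweening step can be approached in many ways; as a rough illustration of the idea (and not necessarily how this particular movie was made), a simple linear cross-fade between successive frames looks like this in Python:

    # Minimal cross-fade tweening sketch, assuming the frames are NumPy float arrays of
    # equal size. This only illustrates slotting 7 synthetic frames between each pair of
    # real frames, which multiplies the frame count roughly eightfold.
    import numpy as np

    def tween(frames, n_between=7):
        """Insert n_between linearly blended frames between each pair of real frames."""
        out = []
        for a, b in zip(frames[:-1], frames[1:]):
            out.append(a)
            for i in range(1, n_between + 1):
                t = i / (n_between + 1)
                out.append((1.0 - t) * a + t * b)  # simple linear cross-fade
        out.append(frames[-1])
        return out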

Close to the end of the pre-Jupiter encounter observatory phase, the imaging strategy changed. Both spacecraft began to take images at very short intervals (2 to 3 minutes), as the planet rotated beneath the spacecraft. The Voyager 1 movie lasted approximately 100 hours (ten Jovian days) and the Voyager 2 movie 50 hours (or five Jovian days).

Both movies were in color. Voyager's television cameras resembled those used in television studios of the day, with some minor differences. To produce a full-color image, Voyager would take a sequence of three frames through a selection of interference filters mounted on a wheel in front of the optics. The movie frames were taken through a repeating sequence of filters: orange, green, and blue for Voyager 1 (OGB), and orange, green, and violet for Voyager 2 (OGV). Usually, you would expect color images to be made with red, green, and blue (RGB) filters. But the Voyager cameras didn't have red filters, because they were relatively insensitive to red wavelengths; the longest-wavelength filter in their arsenal was orange. Pictures taken through the orange filter required exposure times double those of the corresponding green-filter images, because of the cameras' low sensitivity at those longer wavelengths.
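
In practice, assembling one of these composites simply means stacking the three filtered frames into the red, green, and blue channels, with the orange frame standing in for red. Here is a minimal sketch (file names are hypothetical, and the frames are assumed to be already aligned and de-rotated, as described in the processing steps below):

    # Build a color composite from three filtered monochrome frames, with orange
    # standing in for the missing red channel.
    import numpy as np
    import imageio.v3 as iio

    orange = iio.imread("frame_orange.png").astype(np.float32)
    green = iio.imread("frame_green.png").astype(np.float32)
    blue = iio.imread("frame_blue.png").astype(np.float32)

    rgb = np.dstack([orange, green, blue])  # orange -> red, green -> green, blue -> blue
    rgb /= rgb.max()                        # normalize to 0..1 for further processing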

The Data Set

The Voyager cameras were mounted on scan platforms for precision pointing, but there was a practical limit to how many times the pointing could be updated through the long movie sequences. As a result, the images from Voyager are not all centered on Jupiter; the planet drifts from frame to frame, sometimes even slipping partially out of the frame. Stabilizing the frames was a crucial part of my workflow.
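
As a rough illustration of the problem (and not the limb-measuring procedure described in Step 1 below), a frame can be crudely re-centered by thresholding the bright disc and rolling its centroid to the middle of the image. On a partially lit planet the centroid is only an approximation of the true disc center, which is why the limb itself was measured for the real movies:

    # Crude re-centering sketch: threshold the bright disc, find its centroid, and
    # shift it to the center of the frame.
    import numpy as np

    def center_disc(frame, threshold=0.05):
        """Roll the frame so the centroid of the bright disc sits at the image center."""
        mask = frame > threshold * frame.max()
        ys, xs = np.nonzero(mask)
        dy = int(round(frame.shape[0] / 2 - ys.mean()))
        dx = int(round(frame.shape[1] / 2 - xs.mean()))
        return np.roll(frame, (dy, dx), axis=(0, 1))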

Another challenge was that the cameras were affected by geometric distortion, and the amount of distortion was not constant: it depended on the amount of charge accumulating on the detector. To map the distortion, engineers embedded a grid of reseau marks on each camera's vidicon faceplate. Without correcting for the warping, the Jupiter approach movies look as though they were filmed from the bottom of a swimming pool and through a dirty lens. (I speak here from bitter experience: I had to abandon a 2011 attempt at compiling the Voyager 1 movie for this reason.)

Fortunately for this effort, the Ring-Moon Systems Node of the Planetary Data System produced an improved version of the data set. In 2012, they released versions of all the Jupiter encounter images from both Voyagers that had been subjected to a rigorous and peer-reviewed calibration and geometric rectification algorithm. They also filled in the black blemishes of the reseaux with a gray color calculated as the median of surrounding pixel values.
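
That reseau fill amounts to a simple median in-painting. A sketch of the idea (the actual PDS pipeline is more involved; the boolean mask of reseau locations is assumed to be known):

    # Replace each reseau blemish pixel with the median of its unmasked neighbors.
    # reseau_mask is a boolean array that is True at the blemish locations.
    import numpy as np

    def fill_reseaux(frame, reseau_mask, radius=3):
        """Fill masked pixels with the median of nearby unmasked pixel values."""
        filled = frame.copy()
        ys, xs = np.nonzero(reseau_mask)
        for y, x in zip(ys, xs):
            y0, y1 = max(0, y - radius), min(frame.shape[0], y + radius + 1)
            x0, x1 = max(0, x - radius), min(frame.shape[1], x + radius + 1)
            patch = frame[y0:y1, x0:x1]
            good = ~reseau_mask[y0:y1, x0:x1]
            if good.any():
                filled[y, x] = np.median(patch[good])
        return filled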

The original images consist of 800 lines of 800 samples each (a fancy way of saying 800 x 800). However, the geometrically corrected versions were resampled at a size of 1,000 x 1,000 (or one megapixel) to preserve fine detail. While the Voyager cameras are considered clunky by today’s standards, they were capable of taking very sharp images (Voyager 2’s cameras in particular).

Processing the Voyager ISS images

NASA / JPL-Caltech / Ian Regan

Processing the Movies

Having downloaded geometrically corrected and calibrated image files via the OPUS PDS tool, I used Björn Jónsson’s excellent IMG2PNG utility to convert the native IMG files to a more convenient and friendly PNG format. I kept all data at 16 bits per channel throughout the workflow.
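
One detail worth checking when pulling the converted PNGs into any further tooling is that the extra bit depth actually survives the read. Something like this (file name hypothetical) is a cheap sanity check:

    # Confirm the converted frame has not been silently collapsed to 8 bits per channel.
    import numpy as np
    import imageio.v3 as iio

    frame = iio.imread("voyager_frame.png")
    print(frame.dtype)                 # should be a 16-bit (or wider) integer type, not uint8
    frame = frame.astype(np.float32)   # work in floating point from here on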

Using a tried-and-tested suite of image processing software (Corel Paint Shop Pro X6, ImageJ, and Affinity for Windows), I broke down the gargantuan task ahead of me into a series of steps:

(1) I ensured the disc of the planet was centered in each frame. For best alignment, I used the high-contrast limb to measure the position of Jupiter with respect to the top and right margins. Typically, this was executed at 400% scale, thereby reducing imprecision to one quarter of a pixel.

(2) I measured the size of the Jovian disc in each frame, using an automated Python script with output funneled into a TXT file.

(3) Using Celestia, I created a pseudo-Lambertian ‘illumination mask’, to model the phase of Jupiter as seen by Voyager, capturing the way the solar illumination varies across the planet.

(4) Employing the disc sizes measured in Step 2, I created another script that automatically applied the "illumination mask" to all images in the series, thereby removing the shading. In effect, this rendered every point on Jupiter as if it were lit by the equivalent sunlight of local noon. (A code sketch of this de-shading, and of the re-shading in Step 8, appears after this list.)

(5) I created RGB composites from these de-shaded frames, using OGB for the Voyager 1 images and OGV for Voyager 2. Every monochrome frame was turned into a color composite by combining it with the previous and subsequent frames in the sequence. For example, frame number C1549134 was shuttered through Voyager 1’s blue filter; to turn this into a color shot, I imported the previous green (C1549132) and subsequent orange (C1549136) frames.

(6) I applied a predefined mesh warp to correct for the frame-to-frame rotation of the quickly spinning planet. For this to work, the automated script needed to know the "base" filter for a given frame and the apparent size of Jupiter in pixels. In the example given above (C1549134), "blue" was the base filter and was left untouched; the earlier green frame was warped forward and the later orange frame was warped backward, so that all three channels show the planet at the same moment of its rotation. This clip from 2010 outlines a rudimentary version of the process.

(7) Color processing: the combined O-G-B or O-G-V frames produced a garish, greenish Jupiter. I corrected the color (subjectively) to render the planet as the human eye might see it, and the resulting movies and stills happily resemble the Hubble, Cassini, and Juno composites of the giant planet.

(8) After applying a series of sharpening and high-pass filters, I re-imposed the global shading using an inverse of the "illumination mask" I described in Step 4.
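
For readers who think in code, here is a minimal sketch of the de-shading, compositing, and re-shading of Steps 4, 5, and 8. File names are hypothetical, and the mask is assumed to be a Celestia render already scaled to the measured disc size (Step 2) and normalized so that full noon illumination equals 1.0:

    import numpy as np
    import imageio.v3 as iio

    EPS = 1e-3  # avoid dividing by near-zero values off the limb and on the night side

    def deshade(frame, mask):
        """Step 4: divide out the illumination gradient, as if every point were lit at local noon."""
        return np.where(mask > EPS, frame / np.maximum(mask, EPS), 0.0)

    def reshade(frame, mask):
        """Step 8: multiply the global shading back in after sharpening and color work."""
        return frame * mask

    mask = iio.imread("illumination_mask.png").astype(np.float32)
    mask /= mask.max()

    channels = []
    for name in ("orange", "green", "blue"):   # OGB for Voyager 1; OGV for Voyager 2
        f = iio.imread(f"frame_{name}.png").astype(np.float32)
        channels.append(deshade(f, mask))

    rgb = np.dstack(channels)                  # Step 5: orange stands in for red
    # ... sharpening, high-pass filtering, and color correction would go here (Steps 7 and 8) ...
    rgb = reshade(rgb, mask[..., None])        # broadcast the 2-D mask over the color channels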

The process described above would complete the work if all the frames were whole. But there were partial frames, frames with moons, shifting moon shadows, and missing frames.

Fixing Damaged or Missing Frames

For damaged or missing frames, I found that the mesh warping method could produce very satisfactory synthetic frames. As for the moons, I initially charted their orbital motions in Excel, recording a handful of X and Y values and employing cubic splines to interpolate all other positions. The moons were erased from the processed Jupiter images using data patched in from synthetic frames, and I reinstated color composites of them right at the end of the workflow, using my Excel data and a Python script within Paint Shop Pro to do this semi-automatically. This worked very well, producing smoother orbital motions for the moons than manual, frame-by-frame assembly could achieve.
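
The same spline bookkeeping is easy to reproduce outside Excel. A sketch using SciPy, with sample values invented purely for illustration:

    # Fit cubic splines through a handful of hand-measured (frame, x, y) samples and
    # evaluate them for every frame in the sequence.
    import numpy as np
    from scipy.interpolate import CubicSpline

    measured_frames = np.array([0, 120, 260, 400, 520])
    measured_x = np.array([512.0, 530.5, 561.2, 598.8, 640.1])
    measured_y = np.array([400.0, 395.2, 402.7, 418.3, 441.0])

    x_of = CubicSpline(measured_frames, measured_x)
    y_of = CubicSpline(measured_frames, measured_y)

    every_frame = np.arange(measured_frames[0], measured_frames[-1] + 1)
    positions = np.column_stack([every_frame, x_of(every_frame), y_of(every_frame)])
    np.savetxt("moon_positions.txt", positions, fmt="%.2f")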

As for the shadows cast by the moons upon Jupiter, I didn't find any better solution than isolating the shadows from the monochrome frames and inserting them into the final color composites.
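
One way to express that shadow transfer in code (a sketch of the idea, not necessarily the exact procedure used for these movies) is to compare the monochrome frame against a shadow-free synthetic frame and apply the measured darkening to all three color channels:

    # Darken the color composite wherever the monochrome frame shows a moon shadow.
    import numpy as np

    def transfer_shadow(mono_with_shadow, mono_shadow_free, rgb, floor=0.02):
        """Estimate per-pixel darkening from the shadow and apply it to the RGB composite."""
        ratio = mono_with_shadow / np.maximum(mono_shadow_free, floor)
        ratio = np.clip(ratio, 0.0, 1.0)       # only ever darken, never brighten
        return rgb * ratio[..., None]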

The Scan Platform: Injecting Voyager Back Into the Picture

In one version of the restored movies, I reintroduced the scan platform’s pitch and yaw motions. Why? Well, I felt this would enhance the videos, giving the viewer a visceral sensation of being onboard a functioning, operating spacecraft.

I used the VirtualDub freeware video app, along with the similarly freely available Deshaker plugin. Ironically, I used Deshaker for the *opposite* of its intended usage! I ran the plugin on the inverted, raw Voyager frames, measuring the changing attitude of the scan platform as it shuttered away. 

Importing the data into Excel, I greatly smoothed out the x and y values, using a mix of ‘mean’ and ‘median’ functions to arrive at a compromise between faithfulness and viewability. Reinstating the ‘actual’ scan platform motions would have resulted in an unwatchable, or at best nausea-inducing, video!
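
In code, that smoothing boils down to something like the following (the original work was done in Excel; the column names, file names, and window lengths here are illustrative only):

    # Smooth Deshaker's per-frame x/y pan estimates: a median pass to knock out
    # single-frame spikes, then a wide rolling mean to keep only the slow drift.
    import pandas as pd

    log = pd.read_csv("deshaker_pan_values.csv")   # hypothetical export with 'x' and 'y' columns

    def smooth(series, median_window=9, mean_window=51):
        despiked = series.rolling(median_window, center=True, min_periods=1).median()
        return despiked.rolling(mean_window, center=True, min_periods=1).mean()

    log["x_smooth"] = smooth(log["x"])
    log["y_smooth"] = smooth(log["y"])
    log.to_csv("deshaker_pan_values_smoothed.csv", index=False)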

To see how much of an improvement this version of the movie is over the old, here is the version of the Voyager 1 movie most often shown in science documentaries:

VIDEO

Part of Voyager 1's Jupiter rotation movie from 1979, as commonly shown in contemporary (and even recent) science documentaries. It covers one and a half Jupiter rotations.

NASA / JPL-Caltech



