NASA’s Dawn Probe Sent Some Stunning New Images of Ceres


Occator Crater at a distance of 920 miles (1,480 km). (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)

Swooping by at a unique angle, NASA’s Dawn space probe recently captured some of the clearest views yet of dwarf planet Ceres, including Occator Crater and its intriguing bright surface features.

Dawn’s latest orbit, its fifth since arriving at Ceres in the spring of 2015, took it to within 920 miles (1,480 km) of the dwarf planet, and at an angle where the sun’s position was different than during previous orbits. The new images are offering fresh perspectives of this remarkable object and the strange geology responsible for its distinct surface features.

Occator Crater. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)

One of the brightest areas on Ceres can be found within Occator Crater, a hole measuring 57 miles (92 km) wide and 2.5 miles (4 km) deep. Astronomers believe that the bright material at the center of this crater is made up of salts left behind after briny liquid seeped up from below, froze, and then sublimated, meaning it turned directly from solid ice into vapor. An asteroid impact likely triggered the upwelling of the salty liquid.

During the month of October, Dawn snapped thousands of images of Ceres at its 920-mile orbit, many of which can be seen in this new NASA gallery.

This image, taken by NASA’s Dawn on Oct. 17, shows the limb of dwarf planet Ceres along a section of the northern hemisphere. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)
Stunning image of Zadeni Crater on Ceres. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)
The craters Takel and Cozobi are featured in this image. Takel is the young crater with bright material on the left of this image, and Cozobi is the sharply defined crater just below center. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)
A series of linear features: depressions located within the large Yalode Crater. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)
Kupalo Crater, which measures 16 miles (26 kilometers) across and is located at southern mid-latitudes. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)

NASA scientists also released a colorized photo showing what the dwarf planet would actually look like to the human eye. With help from the German Aerospace Center in Berlin, the Dawn team took images from the probe’s initial orbit in 2015, and then calculated the way Ceres reflects different wavelengths of light. It’s not a true color image, but it’s pretty darned close.

A colorized view of Ceres based on data collected by Dawn. (Image: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA)

From here, Dawn will proceed into its sixth orbit of Ceres, attaining an altitude of over 4,500 miles (7,200 km). During this phase, the probe will refine previously collected measurements, particularly those of the surface.


George is a contributing editor at Gizmodo and io9.


Two Satellites Have Spent 10 Years Staring at the Sun


STEREO A and B have been vital in helping us understand our Sun, and now the mission celebrates its 10th birthday.


On October 25, 2006—ten years ago—NASA strapped two satellites onto a Delta II rocket and sent them skyward with an important mission—to figure out what our Sun is all about.

Given that it’s the brightest object in our solar system and the big ball of fire that keeps us alive, you’d think humans would know all kinds of stuff about our friend, Sol. But, of course, for most of human history we’ve been staring at the Sun from one vantage point, and that’s the soil beneath our feet. For the past decade, though, two satellites named STEREO-A and STEREO-B, part of the Solar Terrestrial Relations Observatory mission, have been traveling ahead of and behind Earth’s orbit to get a more complete picture of our Sun than ever before.

The results have been stunning. Here’s a brief history of STEREO and its many, many accomplishments.

STEREO B, In The Beginning

Although other missions, like ESA and NASA’s SOHO satellite, have traversed space in search of answers to our Sun-related questions, the primary goal of STEREO was to capture the first fully 3D image of our Sun, a simultaneous rendering of our life-giving star, as well as mapping coronal mass ejections (CMEs) more accurately. To achieve that feat, NASA needed to coordinate the orbits of two separate satellites, one just inside and one just outside of Earth’s own orbit, to get data from three distinct areas in our solar system.

In this image, we see the nearly completed STEREO-B, the satellite destined to travel behind Earth’s orbit, at the Johns Hopkins University Applied Physics Laboratory in Maryland in 2005. Each satellite carries an extreme ultraviolet imager, two white-light coronagraphs, and a heliospheric imager—all for studying the evolution of CMEs as they travel from the Sun through its atmosphere (the corona).

STEREO Is Ready For Launch

This one-minute clip shows the Delta II launch from Cape Canaveral on October 25, 2006.

After the launch, scientists hailed the satellites as ushering in “a new dawn for solar observation,” with plans to also accurately measure and predict solar storms. The correct prediction of these storms could help protect future astronauts from radiation when humans finally make that long, perilous trek to Mars.

Getting Into Orbit

This is an extended animation of STEREO’s completed and planned orbits through late 2019. By now, STEREO-A and -B have flown by each other on the other side of the Sun and are on their return journey to Earth. You can see the satellites’ current positions right here.

This stereoscopic image of the sun, captured in April 2007, recreates what it would be like for two human eyes to stare at the Sun (albeit on a galactic scale). This data was captured during the early portion of STEREO’s mission as its twin satellites were still near Earth.

The Sun Now Comes in 3D

This “Eye of Sauron”-like image marks the moment in 2011 when STEREO-A and -B were positioned to capture the entire Sun in near real-time. This was the first time scientists had ever seen the Sun all at once.

Strangely, it was also the first time we could say without a doubt that the Sun was indeed a sphere (though we had a good idea that was the case already).

Nope, That Weird Blue Sphere In Space Isn’t Aliens


Rumors swirled over the weekend that NASA was hiding something. Here’s what’s actually happening.

If you were on Facebook this weekend, you may have seen an image showing a mysterious blue sphere in space and a whole bunch of conspiracy theories about it. Aliens? NASA cover-up? As usual, the truth is much more mundane.

One of the cameras on NASA’s Sun-observing STEREO spacecraft, which normally just shows an empty field of stars, captured a giant blue sphere for a few frames before it vanished. Many people on the internet got very excited about this after UFO hunter Pamela Johnson shared the pics on Facebook. Some tabloids like the Daily Mail wrote entire articles speculating about what this giant blue sphere could mean. Theories ranged from rogue planets to holographic images to all kinds of conspiracy theories.

It’s none of the above. Alex Young, the associate director for science at NASA’s Goddard Space Flight Center, tells PopMech that what you’re seeing is the Sun, thanks to a data processing error in which two sets of data were superimposed into a single image. “The data comes down in a big stream, and we process that data, we pluck the images out as they’re coming down in that stream,” he says. “Sometimes the stream [can] get corrupted. That causes the computer program that processes those images to get them wrong.”

On the left is a normal image of the Sun taken by the EUVI camera. On the right is that same image, flipped, rotated, and layered on top of an image taken by the HI1 camera.

Onboard NASA’s STEREO spacecraft is an instrument called SECCHI. Basically, it’s a big box with five different cameras in it. Three of those cameras point directly at the Sun, while the other two are pointed at the space between the Sun and the Earth. They all observe different things, but when the data from those cameras are sent back to Earth, the images from all the cameras get bundled together in one package to save space.

Normally, the computers that receive those images are able to separate them just fine. But sometimes that data can get scrambled and the images get messed up. This means that sometimes the images show up flipped, or rotated, or jumbled in other ways. “That’s exactly what’s happening here,” Young says.

In this particular case, the images from two different cameras were layered on top of each other. An image from the EUVI camera, which looks at the Sun, got flipped, rotated, and superimposed on an image from the HI1 camera, which looks at the stars. And so, for a few frames, it looked like a giant blue sphere appeared in the middle of our solar system.

As is the case with most NASA telescopes, raw images automatically become publicly available once they’re sent back to Earth (because NASA is taxpayer-funded). The agency would remove a goofed, glitchy image like this, but it didn’t get around to it before the picture started making the rounds on the internet.