The new Dyson 360 Eye does a great job picking up dirt—but has trouble reaching it.

You'd think that the first robot vacuum from a company like Dyson, which reinvented the vacuum, fan, and hair dryer, would rival R2-D2 when it came to functionality. But with the 360 Eye, Dyson instead focused on creating a robovac that did one thing very well: cleaning. It delivers as promised, but is that worth $1,000?
The no-frills approach to its robot vacuum is surprising when you consider that Dyson has been developing robovacs for close to 18 years now. Before the 360 Eye, Dyson created the DC06, which, until recently, had only existed outside the company in a handful of leaked photos.

It cleaned well, but the DC06's size, weight, less-than-amazing battery life, and price tag didn't quite meet the company's expectations. As a result, the DC06 was scrapped, the five working models the company created went into exile, and Dyson's robotics division then spent the next 12 years developing the 360 Eye instead.

As far as form factor goes, small and tall is the best way to describe the 360 Eye. Compared to the Samsung POWERbot VR9000, which could easily play a droid in Star Wars, the 360 Eye looks like a tiny can of cookies. Of all the consumer-level robot vacuums currently on the market, the 360 Eye has the smallest footprint by a long shot, but that comes at the cost of being a little on the tall side.

Life is all about trade-offs, and Dyson's engineers decided that being able to squeeze into the small gaps in-between your furniture was more important than being able to squeeze under your couch. As a result, the 360 Eye didn't even come close to fitting under my Ikea couch, but neither could Samsung's POWERbot VR9000, nor a Roomba. I even have trouble squeezing a mop under there, so I feel Dyson's engineers made the right decision by focusing on keeping the 360 Eye's footprint as small as possible.

That trade-off allowed the robovac to squeeze into tight areas that I assumed would always have to be cleaned by hand. Will the 360 Eye be able to clean every hard-to-reach area in your home? No. You'll still need to keep a manual vacuum on hand to ensure every last inch of your floors gets cleaned. But it should at least be able to autonomously clean the most visible areas, so your friends don't think you're a complete slob.
The 360 Eye's design continues Dyson's unintentional approach of creating appliances that look like science fiction props, with its silvery faux-metal plastic housing and bulging 0.33-liter dust bin on the front. But other than a large button on top that lights up with various patterns to signal what the 360 Eye is currently doing or what it needs (charging, connecting to your Wi-Fi network, cleaning, etc.), the only real distinguishing feature atop the robovac is an ominous-looking dome that gives the bot its name.

That dome is a 360-degree camera (looking eerily like HAL 9000's unblinking eye) that feeds a wraparound image of a room to the 360 Eye's processor. You might assume the panoramic camera on top photographs a room's ceiling so the robot can plot its course. But that's not how it works.
The 360 Eye takes a simpler approach to cleaning. Once the robot starts vacuuming, it sticks to a five-meter-square section of the room, which it cleans by spiraling out from the center. Then it moves on to a neighboring square, and so forth, until the room is clean. This makes for more efficient use of its 45-minute run time.
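For readers curious what that spiral-out coverage pattern looks like in practice, here is a minimal Python sketch that generates waypoints for one square cell and then its neighbor. The 5-meter cell size comes from the article; the spacing, angular step, and waypoint logic are illustrative assumptions, not Dyson's actual navigation code.

```python
import math

def spiral_waypoints(center_x, center_y, max_radius, spacing=0.25, step_deg=20):
    """Generate (x, y) waypoints that spiral outward from a cell's center.

    max_radius : how far out from the center to sweep (meters)
    spacing    : radial growth per full turn, roughly a brush-bar width (assumed)
    step_deg   : angular resolution of the path (assumed)
    """
    points = []
    angle = 0.0
    radius = 0.0
    while radius <= max_radius:
        points.append((center_x + radius * math.cos(math.radians(angle)),
                       center_y + radius * math.sin(math.radians(angle))))
        angle += step_deg
        # Archimedean spiral: radius grows linearly with the accumulated angle.
        radius = spacing * angle / 360.0
    return points

# Cover one hypothetical 5 m x 5 m cell, then move on to the neighboring cell.
cell_a = spiral_waypoints(2.5, 2.5, max_radius=2.5)
cell_b = spiral_waypoints(7.5, 2.5, max_radius=2.5)
print(len(cell_a), "waypoints in the first cell")
```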

The 360 Eye's camera can really only see as high as a room's walls, which it photographs up to 30 times per second. Those images are processed by a special algorithm to detect and track distinct corners, like you'd find on tables, windows, or even paintings on a wall, which the robot uses to keep tabs on where it is, where it's been, and what's left to clean.
A simple map of a room is built up as the robovac navigates a space, but is wiped from the bot's memory after a cleaning cycle is complete. This makes it better suited for a home where things are constantly getting moved, creating new obstacles for the robovac to navigate every time it starts cleaning.
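The corner-tracking idea is standard fare in computer vision. Below is a minimal, hypothetical sketch of that kind of landmark detection, using OpenCV's Shi-Tomasi corner detector on a synthetic frame. Dyson's actual algorithm is proprietary, so treat this only as an illustration of the general technique the article describes.

```python
import cv2
import numpy as np

# Fake "room" frame: a dark image with a bright rectangle standing in
# for something like a window frame or a painting on a wall.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame, (80, 60), (240, 180), color=255, thickness=2)

# Detect up to 50 strong corners; these are the kinds of landmarks a robot
# could track from frame to frame to estimate how far it has moved.
corners = cv2.goodFeaturesToTrack(frame, maxCorners=50,
                                  qualityLevel=0.05, minDistance=10)

if corners is not None:
    for x, y in corners.reshape(-1, 2):
        print(f"corner at x={x:.0f}, y={y:.0f}")
```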
The 360 Eye adds extra collision protection in the form of infrared sensors. For the most part, the combination of these two technologies worked seamlessly, and on many occasions I was surprised at how deftly the tiny robovac navigated around table legs and other hard-to-spot obstacles. Collisions did occur from time to time, but thanks to the bot's small form factor, there was barely an impact.
The 360 Eye met its match when cleaning underneath an Ikea chair. It ended up beaching itself on a wooden crossbeam that it didn't see coming. Before I got up to rescue it, the robot just sat there, happily sucking away without moving for about five minutes.

It also had hang-ups in dark spaces. On several occasions, while cleaning underneath a piece of furniture it was barely able to squeeze under, the Dyson 360 Eye needed rescuing, presumably because its 360-degree camera was essentially blinded. The camera is a key part of its ability to navigate a room, and as a result, the robovac won't even turn on if there's not enough light for its camera to work. If you want to schedule it to clean the living room at three in the morning while you're asleep, you'll need to leave some lights on.
Yet these problems could potentially be resolved in future software updates, which the Dyson 360 Eye receives via Wi-Fi. The inclusion of Wi-Fi also allows the 360 Eye to be activated, monitored, and scheduled from the Dyson Link app on iOS or Android devices.

Pairing the app with the 360 Eye was a little tricky, but only because the app looked like it had failed when in reality it had successfully connected to the robovac. Functionality is limited, too: the most complex thing you can do through the app is schedule the robot to clean throughout the week. It does show you the map of a room it created after a cleaning is complete, so you can see what areas it might have missed. But it feels like a half-feature, because you can't then click on the map and direct the robot back to a certain area.

On the underside of the 360 Eye you'll find a pair of metal contacts the robot vacuum uses for charging, its spinning brush bar, and a pair of bright blue rubber tank treads.

They might be more complicated than a simple pair of wheels (more parts means more parts that can break), but the treads also provide better grip since there's more surface area making contact with your floors, and the large teeth improve the 360 Eye's ability to clamber over obstacles, and transition from hard floors to carpeting. They also help the robovac maintain a straighter course—taking the tiny bot smoothly to its tiny charging base, which easily unfolds and sidles up against a wall.

Because it's first and foremost a Dyson vacuum, running off the company's tiny but mighty V2 digital motor, the 360 Eye sucks up dirt and debris as efficiently as any of the company's manual vacuums.

The spinning disks of whiskers used by robots like the Roomba to sweep debris from the edges of the bot inwards don't exist on the 360 Eye. Instead it features the same edge-to-edge brush bar that the company's manual vacs use, so it cleans as close to the edge of a wall as possible. It still leaves about a half-inch gap, but its ability to suck in dirt and debris along walls easily outperformed that of the other robovacs I've tested.

After using the Dyson 360 Eye for some time, I can understand why the company decided to focus on its ability to clean. That's where its competitors have made compromises, which makes no sense for a product that's supposed to save you work and make your life easier. But there are a few features I would like to see added to help justify the 360 Eye's $1,000 price tag.
The ability to manually steer the robot from the app to hit missed spots, or to move it to another room, would be helpful. For comparison, Samsung's $1,000 POWERbot VR9000 can follow a red crosshair projected on the floor to help it navigate to a specific area. That's a genuinely useful feature, not a gimmick. There's also no way to limit where the 360 Eye cleans except by setting up physical obstacles in doorways to keep it contained. Notifications, or an alarm, for when the robot gets stuck would be useful too.
Of all the robot vacuums I've tested, Dyson's 360 Eye is the first that will genuinely clean your floors as well as a manual vacuum cleaner can. That being said, it won't completely eliminate vacuuming from your weekly chore list. It will save you a lot of time, though, which is what Dyson is really selling here for $1,000. The company's first robot vacuum feels a little light on features given the steep price tag, but through software updates and improvements to its app, you may one day never need to touch a vacuum again.

www.matthewcattellphotography.com posted a photo:
Two fallow deer bucks at dawn as a red sun rose over the surrounding tree-tops.

Naarden is a star fort in the Netherlands. The city was constructed in the manner seen here so that an attack on any individual wall could be defended from the two adjacent star points by shooting at the enemy from behind. Today Naarden is home to roughly 17,000 residents. /// Source imagery: @digitalglobe (at Naarden)
Think summer's hot on Earth? Space physicists tracking weather on Jupiter say the roar of the raging storm we call the Great Red Spot heats the outer atmosphere above it by more than 1,000 degrees F.
Jupiter's Great Red Spot may be responsible for stirring an atmospheric hotspot into a frenzy, causing it to be hundreds of degrees warmer than anywhere else on the planet.…

My Planet Experience posted a photo:
The Andean condor is considered endangered but is in far better shape than its California cousin. Perhaps a few thousand South American birds survive, and reintroduction programs are working to supplement that number.
These long-lived birds have survived over 75 years in captivity, but they reproduce slowly. A mating pair produces only a single offspring every other year, and both parents must care for their young for a full year.
© www.myplanetexperience.com
When NASA's Dawn spacecraft arrived to orbit the dwarf planet Ceres in March 2015, mission scientists expected to find a heavily cratered body generally resembling the protoplanet Vesta, Dawn's previous port of call. Instead, as the spacecraft drew near to Ceres, a somewhat different picture began to emerge: Something has happened to Ceres to remove its biggest impact basins.
Now, writing in the online journal Nature Communications, a team of Dawn scientists led by Simone Marchi of the Southwest Research Institute in Boulder, Colorado, reports on their computer simulations of Ceres' history. These suggest that Ceres has experienced significant geological evolution, possibly erasing the large basins.

The Dawn team includes Arizona State University's David Williams, who is the director of the Ronald Greeley Center for Planetary Studies in ASU's School of Earth and Space Exploration. Williams oversees a team of researchers using Dawn data to map the geology of Ceres.
He says, "When we first starting looking at Ceres images, we noticed that there weren't any really large impact basins on the surface." None are larger than 177 miles (285 kilometers) across. This presents a mystery, he says, because Ceres must have been struck by large asteroids many times over its 4.5-billion-year history.
"Even Vesta, only about half of Ceres' size, has two big basins at its south pole. But at Ceres, all we saw was the Kerwan Basin, just 177 miles in diameter," Williams says. "That was a big red flag that something had happened to Ceres."
The Kerwan Basin's name was proposed by Williams, and it commemorates the Hopi Indian spirit of the sprouting corn.
Study lead author Marchi notes, "We concluded that a significant population of large craters on Ceres has been obliterated beyond recognition over geological time scales, which is likely the result of Ceres' peculiar composition and internal evolution."
The team's simulations of collisions with Ceres predicted that it should have 10 to 15 craters larger than 250 miles (400 kilometers) in diameter, and at least 40 craters larger than 60 miles (100 kilometers) wide. In reality, however, Dawn found that Ceres has only 16 craters larger than 60 miles, and none larger than the 177-mile Kerwan Basin.
Further study of Dawn's images revealed that Ceres does have three large-scale depressions called "planitiae" that are up to 500 miles (800 kilometers) wide. These have craters within them that formed in more recent times, but the depressions could be left over from bigger impacts.
One of the depressions, called Vendimia Planitia, is a sprawling area just north of the Kerwan Basin. Vendimia Planitia must have formed much earlier than Kerwan.
So what removed Ceres' large craters and basins? "If Ceres were highly rocky, we'd expect impact craters of all sizes to be preserved. Remote sensing from Earth, however, told us even before Dawn arrived that the crust of Ceres holds a significant fraction of ice in some form," Williams explains.
If Ceres' crust contained a large proportion of ice -- especially if mixed with salts -- that would weaken the crust and let the topography of a large basin relax and become smoother, perhaps even disappear.
In addition, says Williams, Ceres must have generated some internal heat from the decay of radioactive elements after it formed. This, too, could have helped soften or erase large-scale topographic features.
He adds, "Plus we do see evidence of cryovolcanism -- icy volcanism -- in the bright spots found scattered over Ceres, especially in Occator Crater." Cryovolcanism behaves like the rocky kind, only at much lower temperatures, where "molten ice" -- water or brine -- substitutes for molten rock.
"It's possible that there are layers or pockets of briny water in the crust of Ceres," says Williams. "Under the right conditions, these could migrate to the surface and be sources for the bright spots."
For example, in Occator Crater, he points out, "the central bright spot is a domed feature which looks as if it has erupted or been pushed up from below."
NASA plans for Dawn to continue orbiting Ceres as the dwarf planet makes its closest approach to the Sun in April 2018. Scientists want to see if the increasing solar warmth triggers any activity or produces detectable changes in Ceres' surface.
"Ceres is revealing only slowly the answers to her many mysteries," Williams says. "Completing the geological maps over the next year, and further analysis of the compositional and gravity data, will help us understand better Ceres' geologic evolution."
The Daily Galaxy via University of Arizona
This past March, Robin Li Yanhong, the founder and chief executive of Baidu, the online search giant often described as China's Google, announced that he is looking to the nation's military to support the China Brain Project, an effort to make the mainland the world leader in developing artificial intelligence (AI) systems. It would be a massive, "state-level" initiative, comparable in ambition to the Apollo program that landed the first humans on the moon in 1969.
In January 2016, theoretical physicist Stephen Hawking warned that blindly embracing pioneering technology could trigger humanity's annihilation. "The primitive forms of artificial intelligence we already have, have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race," Hawking told the BBC in 2014. "Once humans develop artificial intelligence it would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn't compete and would be superseded."
Artificial intelligence will surpass human intelligence after 2020, predicts Vernor Vinge, a world-renowned pioneer in AI who has warned about the risks and opportunities that an electronic super-intelligence would offer to mankind. "It seems plausible that with technology we can, in the fairly near future," says the sci-fi legend, "create (or become) creatures who surpass humans in every intellectual and creative dimension. Events beyond such an event -- such a singularity -- are as unimaginable to us as opera is to a flatworm."
There was the psychotic HAL 9000 in "2001: A Space Odyssey," the humanoids which attacked their human masters in "I, Robot" and, of course, "The Terminator", where a robot is sent into the past to kill a woman whose son will end the tyranny of the machines.
Experts interviewed by AFP were divided. Some agreed with Hawking, saying that the threat, even if it were distant, should be taken seriously. Others said his warning seemed overblown. "I'm pleased that a scientist from the 'hard sciences' has spoken out. I've been saying the same thing for years," said Daniela Cerqui, an anthropologist at Switzerland's Lausanne University.
Gains in AI are creating machines that outstrip human performance, Cerqui argued. The trend eventually will delegate responsibility for human life to the machine, she predicted. "It may seem like science fiction, but it's only a matter of degrees when you see what is happening right now," said Cerqui. "We are heading down the road he talked about, one step at a time."
Nick Bostrom, director of a program on the impacts of future technology at the University of Oxford, said the threat of AI superiority was not immediate. Bostrom pointed to current and near-future applications of AI that were still clearly in human hands -- things such as military drones, driverless cars, robot factory workers and automated surveillance of the Internet. But, he said, "I think machine intelligence will eventually surpass biological intelligence -- and, yes, there will be significant existential risks associated with that transition."
Other experts said "true" AI -- loosely defined as a machine that can pass itself off as a human being or think creatively -- was at best decades away, and cautioned against alarmism.
Since the field was launched at a conference in 1956, "predictions that AI will be achieved in the next 15 to 25 years have littered the field," according to Oxford researcher Stuart Armstrong. "Unless we missed something really spectacular in the news recently, none of them have come to pass," Armstrong says in a book, "Smarter than Us: The Rise of Machine Intelligence."
Jean-Gabriel Ganascia, an AI expert and moral philosopher at the Pierre and Marie Curie University in Paris, said Hawking's warning was "over the top. Many things in AI unleash emotion and worry because it changes our way of life," he said. "Hawking said there would be autonomous technology which would develop separately from humans. He has no evidence to support that. There is no data to back this opinion."
"It's a little apocalyptic," said Mathieu Lafourcade, an AI language specialist at the University of Montpellier, southern France. "Machines already do things better than us," he said, pointing to chess-playing software. "That doesn't mean they are more intelligent than us."
Allan Tucker, a senior lecturer in computer science at Britain's Brunel University, took a look at the hurdles facing AI. Recent years have seen dramatic gains in data-processing speed, spurring flexible software to enable a machine to learn from its mistakes, he said. Balance and reflexes, too, have made big advances. Tucker pointed to the US firm Boston Dynamics as being in the research vanguard. "These things are incredible tools that are really adaptive to an environment, but there is still a human there, directing them," said Tucker. "To me, none of these are close to what true AI is."
Tony Cohn, a professor of automated reasoning at Leeds University in northern England, said full AI is "still a long way off... not in my lifetime certainly, and I would say still many decades, given (the) current rate of progress." Despite big strides in recognition programmes and language cognition, robots perform poorly in open, messy environments with lots of noise, movement, objects and faces, said Cohn.
Such situations require machines to have what humans possess naturally and in abundance -- "commonsense knowledge" to make sense of things. Tucker said that, ultimately, the biggest barrier facing the age of AI is that machines are... well, machines. "We've evolved over however many millennia to be what we are, and the motivation is survival. That motivation is hard-wired into us. It's key to AI, but it's very difficult to implement."
"The Singularity" is seen by some as the end point of our current culture, when the ever-accelerating evolution of technology finally overtakes us and changes everything. It's been represented as everything from the end of all life to the beginning of a utopian age, which you might recognize as the endgames of most other religious beliefs.
While the definitions of the Singularity are as varied as people's fantasies of the future, for a very obvious reason, most agree that artificial intelligence will be the turning point. Once an AI is even the tiniest bit smarter than us, it'll be able to learn faster, and we'll simply never be able to keep up. This will render us utterly obsolete, at least in evolutionary terms.
Susan Schneider of the University of Pennsylvania is one of the few thinkers outside the realm of science fiction who has considered the notion that artificial intelligence is already out there, and has been for eons.
In her recent study, Alien Minds, Schneider asks: how might aliens think? And would they be conscious? "I do not believe that most advanced alien civilizations will be biological," Schneider says. "The most sophisticated civilizations will be postbiological, forms of artificial intelligence or alien superintelligence."
Search for Extraterrestrial Intelligence (SETI) programs have been searching for biological life. Our culture has long depicted aliens as humanoid creatures with small, pointy chins, massive eyes, and large heads, apparently to house brains that are larger than ours. Paradigmatically, they are “little green men.” While we are aware that our culture is anthropomorphizing, Schneider acknowledges that her suggestion that aliens are supercomputers may strike us as far-fetched. So what is her rationale for the view that most intelligent alien civilizations will have members that are superintelligent AI?
Schneider offers three observations that, taken together, support her conclusion for the existence of alien superintelligence.
The first is "the short window observation": Once a society creates the technology that could put them in touch with the cosmos, they are only a few hundred years away from changing their own paradigm from biology to AI. This “short window” makes it more likely that the aliens we encounter would be postbiological.
The short window observation is supported by human cultural evolution, at least thus far. Our first radio signals date back only about a hundred and twenty years, and space exploration is only about fifty years old, but we are already immersed in digital technology, such as cell-phones and laptop computers.
Devices such as Google Glass promise to bring the Internet into more direct contact with our bodies, and it is probably a matter of less than fifty years before sophisticated internet connections are wired directly into our brains.
The Daily Galaxy via AFP and South China Morning Post
Image credit: With thanks to Paul Imre
europeanspaceagency posted a photo:
The docking ring used by ESA's Automated Transfer Vehicle cargo spacecraft for five missions to the International Space Station is displayed in the laboratory corridor of ESA's technical heart in the Netherlands.
Supplied by Russia's space agency, and carried by Russia's own ferry craft, it is designed to work with docking ports on the Russian part of the Space Station.
The extended probe made contact with the Station's receptor and then retracted to join the vehicles together.
Sensors on the ring detected that the interface was safely tightened, after which a set of four hooks engaged to strengthen hold of the 20-tonne ATV on the orbital complex. Four further hooks extended from the Station side for a firm grip.
Embedded within the ring are electrical and data connections so that ATV could receive power from the Station and their computers could communicate. Fluid links transferred propellants and air into the Station's tanks.
The ring also includes the hatch for the crew to enter and unload the ferry. At the end of ATV's mission, springs gently pushed it away from the Station without the need for firing any thrusters.
Tours of ESA's site offered by the Space Expo visitor centre include the laboratory corridor. Space Expo also has an example of the Station side of the docking system on display.
The docking ring will also be on show to visitors during this year's Open Day on Sunday 2 October; to register to visit, click here.
Credit: ESA/G. Porter
Light from a distant galaxy can be strongly bent by the gravitational influence of a foreground galaxy, an effect called strong gravitational lensing. Normally a single galaxy is lensed at a time. The same foreground galaxy can - in theory - simultaneously lens multiple background galaxies. Although extremely rare, such a lens system offers a unique opportunity to probe the fundamental physics of galaxies and add to our understanding of cosmology. One such lens system has recently been discovered and the discovery was made not in an astronomer's office, but in a classroom. It has been dubbed the Eye of Horus, and this ancient eye in the sky will help us understand the history of the universe.
Schematic diagram (NAOJ): a galaxy 7 billion light-years from Earth bends the light from the two galaxies behind it, at distances of 9 billion and 10.5 billion light-years respectively, creating the gravitational lens effect of Eye of Horus.
Subaru Telescope organizes a school for undergraduate students each year. One such session was held in September 2015 at the NAOJ headquarters in Mitaka, Tokyo. Subaru is currently undertaking a massive survey to image a large area of the sky at an unprecedented depth with Hyper Suprime-Cam as part of the Subaru Strategic Program. A group of astronomers and young students were analyzing some of that Hyper Suprime-Cam data at the school when they found a unique lens system. It was a classic case of a serendipitous discovery.
"When I was looking at HSC images with the students, we came across a ring-like galaxy and we immediately recognized it as a strong-lensing signature," said Masayuki Tanaka, the lead author of a science paper on the system's discovery. "The discovery would not have been possible without the large survey data to find such a rare object, as well as the deep, high quality images to detect light from distant objects."
Arsha Dezuka, a student who was working on the data, was astonished at the find. "It was my first time to look at the astronomical images taken with Hyper Suprime-Cam and I had no idea what the ring-like galaxy is," she said. "It was a great surprise for me to learn that it is such a rare, unique system!"
A close inspection of the images revealed two distinct arcs/rings of light with different colors. This strongly suggested that two distinct background galaxies were being lensed by the foreground galaxy. The lensing galaxy has a spectroscopic redshift of z = 0.79 (which means it's 7.0 billion light-years away) based on data from the Sloan Digital Sky Survey. Follow-up spectroscopic observations of the lensed objects using the infrared-sensitive FIRE spectrometer on the Magellan Telescope confirmed that there are actually two galaxies behind the lens. One lies at z = 1.30 and the other is at z = 1.99 (9.0 and 10.5 billion light-years away, respectively).
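As a rough sanity check, the light-travel distances quoted above follow directly from the measured redshifts under a standard cosmology. Here is a minimal Python sketch using astropy's built-in Planck 2018 parameters; this is an assumed cosmology for illustration, and the paper's exact figures may differ slightly.

```python
from astropy.cosmology import Planck18

# Redshifts of the lensing galaxy and the two lensed background galaxies
# as quoted in the article.
redshifts = {"lens": 0.79, "source 1": 1.30, "source 2": 1.99}

for name, z in redshifts.items():
    # Lookback time: how long the light has been travelling, which is what
    # the "billions of light-years away" figures in the article correspond to.
    t = Planck18.lookback_time(z)
    print(f"{name}: z = {z:.2f}, light emitted ~{t.value:.1f} billion years ago")
```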
"The spectroscopic data reveal some very interesting things about the background sources," said Kenneth Wong from NAOJ, the second author of the scientific paper describing the system. "Not only do they confirm that there are two sources at different distances from us, but the more distant source seems to consist of two distinct clumps, which could indicate an interacting pair of galaxies. Also, one of the multiple images of that source is itself being split into two images, which could be due to a satellite galaxy that is too faint for us to see."
The distinct features of the system (several bright knots, an arc, a complete Einstein ring) arise from the nice alignment of the central lens galaxy and both sources, creating an eye-like structure. The astronomers dubbed it Eye of Horus, for the sacred eye of an ancient Egyptian god, since the system has an uncanny resemblance to it.
The survey with Hyper Suprime-Cam is only 30% complete and it will collect data for several more years. Astronomers expect to find roughly 10 more such systems in the survey, which will provide important insights into the fundamental physics of galaxies as well as how the universe expanded over the last several billion years.
The "Canarias Einstein ring" shown at the top of the page is one of the most symmetrical discovered until now and is almost circular, showing that the two galaxies are almost perfectly aligned, with a separation on the sky of only 0.2 arcseconds. The source galaxy is 10,000 million light years away from us. Due to the expansion of the Universe, this distance was smaller when its light started on its journey to us, and has taken 8,500 million years to reach us. We observe it as it was then: a blue galaxy which is beginning to evolve, populated by young stars which are forming at a high rate. The lens galaxy is nearer to us, 6,000 million light years away, and is more evolved. Its stars have almost stopped forming, and its population is old.
The Daily Galaxy via National Institutes of Natural Sciences
