The first asteroid from outside the solar system is like nothing we’ve ever seen

In mid-October 2017, a telescope in Hawai’i spotted a speck of light passing across the sky. Astronomers first thought it was a normal asteroid, albeit one moving pretty fast, but after a few days of observations they threw that theory out of the window. The object was not from our solar system at all.

In the days that followed, the European Southern Observatory (ESO) brought its Very Large Telescope to bear on the object – which by this point had been named 1I/2017 U1, or “`Oumuamua”. 

“We had to act quickly,” said team member Olivier Hainaut from ESO in Garching, Germany. “`Oumuamua had already passed its closest point to the Sun and was heading back into interstellar space.”

Orbit, brightness and colour

By combining images captured through four different filters with pictures from other large telescopes, the team measured `Oumuamua’s orbit, brightness and colour. They discovered that it varies dramatically in brightness as it spins on its axis every 7.3 hours.

Karen Meech, who led the team, explained what this means: “This unusually large variation in brightness means that the object is highly elongated: about ten times as long as it is wide, with a complex, convoluted shape. We also found that it has a dark red color, similar to objects in the outer Solar System, and confirmed that it is completely inert, without the faintest hint of dust around it.”
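Meech’s elongation estimate follows from a standard light-curve relation: for an idealised ellipsoid spinning side-on, the brightness swing in magnitudes grows with the logarithm of the axis ratio. A minimal sketch of that relation (assuming a simple ellipsoid, not `Oumuamua’s actual convoluted shape):

```python
import math

def lightcurve_amplitude(axis_ratio):
    """Peak-to-trough brightness swing, in magnitudes, for an idealised
    ellipsoid spinning side-on: its projected area varies between being
    proportional to its length and proportional to its width."""
    return 2.5 * math.log10(axis_ratio)

# An object ten times as long as it is wide swings by ~2.5 magnitudes,
# i.e. roughly a factor of ten in brightness
print(round(lightcurve_amplitude(10), 1))  # 2.5
```

Ordinary, roughly spherical asteroids vary by only a few tenths of a magnitude, which is why a swing this large stood out.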

The astronomers believe that the asteroid, which is about 400 metres long, obtained its reddish colour from the bombardment of cosmic rays over the millions of years that it has spent in deep space.

Where did it come from?

Orbital data suggests that it came from what is now the direction of the star Vega, but Vega is so far away that it wouldn’t have actually been in that position when the object left. As such, `Oumuamua may have been spinning through the Milky Way for hundreds of millions of years before we found it in our celestial garden.

We’re not entirely sure how common it is for interstellar objects like this one to visit us. Astronomers believe it might be as regular as once per year, but it’s only recently that we’ve been able to build telescopes powerful enough to detect them.

“We are continuing to observe this unique object,” said Hainaut, “and we hope to more accurately pin down where it came from and where it is going next on its tour of the galaxy. And now that we have found the first interstellar rock, we are getting ready for the next ones!”



This nanotube motion sensor could make wearables cheaper

Wearable devices are still kinda expensive, for a few reasons. First, the components need to be small. Second, the components need to be resilient enough to survive getting bashed about a little during daily use. And finally, people aren’t buying that many of them yet, so the economies of scale aren’t there.

Now, a team from Florida State University might have a partial answer to those first two problems, with a new class of motion sensor that could make wearables do more for less.

Richard Liang, director of the High-Performance Materials Institute and professor at the FAMU-FSU College of Engineering, has led the development of an advanced series of motion sensors made using buckypaper – thin, flexible sheets of carbon nanotubes.

Silver ink

Inside the sensors, a strip of buckypaper is combined with silver ink electrodes that can be printed on a normal inkjet printer. The result is a sensor that is more sensitive than the flexible metallic sensors used in some wearables, but not as rigid and fragile as more-sensitive semiconductor sensors.

“We measure sensors by gauge factor, which indicates how much resistance value changes as a material is strained or bent,” said doctoral candidate Joshua DeGraff, the lead author of a paper describing the new technology.

“Our gauge factor has been up to eight times higher than commercial sensors and 75 percent higher than many other carbon nanotube sensors.”
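The gauge factor DeGraff mentions has a simple definition: the fractional change in resistance divided by the strain that caused it. A quick illustration (the numbers below are made up for the example, not taken from the paper):

```python
def gauge_factor(resistance_change, baseline_resistance, strain):
    """Gauge factor: fractional resistance change per unit of strain.
    Higher values mean the sensor can register smaller movements."""
    return (resistance_change / baseline_resistance) / strain

# A sensor whose resistance rises 2% under 0.5% strain has a gauge factor of 4
gf = gauge_factor(resistance_change=0.02, baseline_resistance=1.0, strain=0.005)
print(round(gf, 1))  # 4.0
```

An eightfold improvement in gauge factor means the same bend produces an eight-times-larger resistance signal, which is easier to read with cheap electronics.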

Next steps

The next steps in developing the sensor will be to further reduce the thickness of the device, allowing it to be integrated into clothing, and to test it on more complex models.

“For sensor technology, you need it to be flexible, you need it to be affordable and you need it to be scalable,” said Liang. 

“This new technology is versatile and the sensors are affordable to print. It’s a big innovation that presents many possibilities down the road.”



Smarter people are better at League of Legends and Dota 2

Some games are harder than others, but one particular genre – the multiplayer online battle arena (or MOBA for short) – is notorious for its punishing difficulty curve. Memorising the skills and abilities of dozens of characters, as well as the functions of hundreds of spells, items and other mechanics, is no easy task.

So it’s probably not surprising to learn that researchers have found a correlation between people who perform well at MOBAs like League of Legends and Dota 2, and people who perform well in intelligence tests.

A team from the University of York’s departments of Psychology and Computer Science performed an experiment in which people who were highly experienced with League of Legends were asked to complete a standard pen-and-paper intelligence test. They found a correlation between performance in the game and performance in the tests.

Multiple factors

Then, in a second study, they analysed data from two MOBA games: League of Legends and Dota 2, as well as two first-person shooters: Destiny and Battlefield 3. Here, they found that in groups of thousands of players, performance in MOBAs and IQ tests behaves in similar ways as players get older.

But interestingly, the same effect wasn’t found for first-person shooter games. In these, performance declined after the player’s teenage years, even while their scores on IQ tests remained high.

“Unlike First Person Shooter (FPS) games where speed and target accuracy are a priority, Multiplayer Online Battle Arenas rely more on memory and the ability to make strategic decisions taking into account multiple factors,” said Athanasios Kokkinakis, a PhD student with the EPSRC Centre for Intelligent Games and Game Intelligence at the University of York and lead author of a paper describing the discoveries, published in PLOS ONE.

Competitive complexities

Alex Wade of the University of York’s Department of Psychology and Digital Creativity Labs added: “Games such as League of Legends and DOTA 2 are complex, socially-interactive and intellectually demanding. Our research would suggest that your performance in these games can be a measure of intelligence.”

“Research in the past has pointed to the fact that people who are good at strategy games such as chess tend to score highly at IQ tests. Our research has extended this to games that millions of people across the planet play every day.”



Biologists have made a beetle with three eyes

Biologists at Indiana University have created a beetle with a functional extra eye, in the hope of studying the genetic building blocks that define how the insect head develops.

The research, which was published in the Proceedings of the National Academy of Sciences, built on previous experiments that accidentally produced an extra eye. Or technically, the “fusion” of two sets of additional eyes.

To create the three-eyed beetle, the scientists used a simple genetic tool to deactivate a single gene in the insect’s genome. Previous research had shown the gene plays a role in telling the head how to form.

“This study experimentally disrupts the function of a single, major gene,” said Armin P. Moczek, a professor in the IU Bloomington College of Arts and Sciences’ Department of Biology. 

“And, in response to this disruption, the remainder of head development reorganises itself to produce a highly complex trait in a new place: a compound eye in the middle of the head.”

Nerve connections

In tests of the third eye, the team found that it had the same cell types, the same nerve connections and made the beetle respond in the same ways as the two other normal eyes.

“Developmental biology is beautifully complex in part because there’s no single gene for an eye, a brain, a butterfly’s wing or a turtle’s shell,” said Moczek.

“Instead, thousands of individual genes and dozens of developmental processes come together to enable the formation of each of these traits.”

Eduardo E. Zattara, lead author on the study, added: “The use of ectopic eyes is a highly accessible paradigm to study all of this, across many types of organisms. We regard this study as really opening the door to new avenues of investigation in multiple disciplines.”



The sooner we roll out autonomous cars, the sooner we start saving lives

Humans are bad drivers. You, personally, might be a great driver, but humans as a whole are terrible. So terrible that more than 35,000 people died on US roads alone last year as a result of mistakes that we made.

One of the most promising ways to bring this figure down is to let robots drive us instead. Self-driving cars see better, react faster and never get tired or drunk. But there’s a problem – people expect perfection from their autonomous chauffeurs, and this pursuit of perfection could actually stand in the way of saving lives.

“We lost 35,200 lives on our roads last year,” Mark Rosekind, then the chief regulator of the US National Highway Traffic Safety Administration, told a symposium last year. “If we wait for perfect, we’ll be waiting for a very, very long time. How many lives might we be losing if we wait?”

Answering the question

Now researchers from the Rand Corporation have made a first attempt to answer that question. Nidhi Kalra and David Groves have developed a set of tools that show that introducing autonomous vehicles when they are just a little bit better than human drivers could save hundreds of thousands of lives over 30 years, compared to waiting until they are perfect.

In tests involving 500 different future scenarios, permitting autonomous vehicles onto the roads sooner saved lives in the long term. In most scenarios they saved lives in the short term too. You can play with different scenarios over on Rand’s website.
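The trade-off the Rand tools explore can be sketched with a toy model. All the numbers below are purely illustrative (they are not Rand’s figures), and the model assumes instant full adoption at launch, which real fleets obviously wouldn’t see:

```python
# Toy model: compare 30-year road deaths if autonomous vehicles launch
# immediately at 10% safer than humans and keep improving, versus
# waiting 15 years for a near-perfect 90% reduction.
HUMAN_DEATHS_PER_YEAR = 35_000  # roughly the US figure quoted above

def total_deaths(launch_year, initial_reduction, improvement_per_year, horizon=30):
    """Sum road deaths over the horizon; after launch, AVs cut the human
    baseline by a reduction factor that improves a little each year."""
    deaths = 0.0
    reduction = initial_reduction
    for year in range(horizon):
        if year < launch_year:
            deaths += HUMAN_DEATHS_PER_YEAR  # humans still driving
        else:
            deaths += HUMAN_DEATHS_PER_YEAR * (1 - reduction)
            reduction = min(1.0, reduction + improvement_per_year)
    return deaths

deploy_early = total_deaths(launch_year=0, initial_reduction=0.10,
                            improvement_per_year=0.03)
wait_for_perfect = total_deaths(launch_year=15, initial_reduction=0.90,
                                improvement_per_year=0.03)
print(f"Lives saved by deploying early: {wait_for_perfect - deploy_early:,.0f}")
```

Even in this crude sketch, the deaths accumulated while waiting for near-perfect cars swamp the extra deaths caused by merely-good early ones, which is the intuition behind Rosekind’s question.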

“This tool helps change the conversation from one focused on how safe the cars are when they’re introduced to one that considers how even small safety advantages now can grow into the future—saving lives along the way,” Groves said.

“It helps us ask a better question: What should we do today so that over time autonomous vehicles become as safe as possible as quickly as possible without sacrificing lives to get there?”

A full peer-reviewed report on the cost of waiting for nearly-perfect autonomous cars has been published on the Rand website. 



Nasa’s next Mars rover has 23 eyes for scoping out the Red Planet

If we were to judge the supremacy of interplanetary rovers by the number of cameras they have, Nasa’s Mars 2020 mission would be the undisputed king of the hill.

The rover has a grand total of 23 different “eyes”, which will guide it as it performs its allocated tasks. From studying the atmosphere to keeping track of samples being processed inside the robot’s belly, almost everything that the rover does will be visually recorded.

The reason for this abundance? Smartphones. Well, kind of. Camera technology has taken great leaps and bounds over the last decade or so, chiefly led by the smartphone revolution. We can now see more clearly than ever with tinier and tinier devices. On a space mission, where every gram and cubic centimetre counts, that makes a real difference.

“Camera technology keeps improving,” said Justin Maki of JPL, Mars 2020’s imaging scientist. “Each successive mission is able to utilize these improvements, with better performance and lower cost.”

Digital eyes

While the 17 cameras mounted on Nasa’s Curiosity rover have allowed us to see the Red Planet in ways that we had never been able to before, their abilities will be substantially outstripped by Mars 2020’s digital eyes.

Among the new rover’s instruments will be engineering cameras (for driving and spotting obstacles) with a higher resolution and wider field of view than anything on Curiosity. “Our previous Navcams would snap multiple pictures and stitch them together,” said Colin McKinney of JPL, product delivery manager for the new engineering cameras. “With the wider field of view, we get the same perspective in one shot.”

It’ll also have six cameras that will record the entry, descent and landing process, an improved version of Curiosity’s Mastcam with a 3:1 zoom lens, a suite of cameras for studying Mars’ clouds and atmosphere, and even a remote imager that can capture colour images.

Interplanetary internet

The only problem will be getting all these pictures back to Earth. 

“The limiting factor in most imaging systems is the telecommunications link,” Maki said. “Cameras are capable of acquiring much more data than can be sent back to Earth.”

Thankfully, the rudimentary interplanetary internet that exists between Earth and Mars lets us relay data through orbiting spacecraft, making it easier to send back data even if there’s no direct line of sight to the rover.

You can find all the technical detail that you might want on Mars 2020’s camera systems, including images of how they see the world, over on Nasa’s website.

