DNA Breakthrough Could Mean Powerful New Computers Ahead
Researchers at Harvard University have encoded the first digital movie in the DNA of living bacteria and have successfully retrieved the recording for playback.
An engineering research team in Japan has developed a new solar cell that may be able to raise the efficiency of photovoltaic cells above 50 percent, and theoretically as high as 63 percent under certain conditions. This matters because consumer-grade cells are limited to a maximum efficiency of around 26 percent, and most cells on the market achieve only 12 to 18 percent. Raising that ceiling would make renewable energy more accessible to consumers and drastically improve the output of commercial solar farms.
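To put those percentages in perspective, the short sketch below compares the peak output of a fixed panel area at today's typical efficiencies against the proposed ones. The array size and irradiance figures are illustrative assumptions, not values from the research.

```python
# Back-of-the-envelope comparison of solar array output at different
# cell efficiencies. The 10,000 m^2 array size and 1,000 W/m^2 peak
# irradiance are illustrative assumptions, not figures from the research.

IRRADIANCE_W_PER_M2 = 1_000   # rough peak sunlight at ground level
ARRAY_AREA_M2 = 10_000        # hypothetical commercial-scale array

def array_output_mw(efficiency: float) -> float:
    """Peak array output in megawatts for a given cell efficiency (0-1)."""
    return IRRADIANCE_W_PER_M2 * ARRAY_AREA_M2 * efficiency / 1e6

for eff in (0.15, 0.26, 0.50, 0.63):
    print(f"{eff:.0%} efficient cells -> {array_output_mw(eff):.2f} MW peak")
```

Under those assumptions, moving from 15 percent to 50 percent efficient cells more than triples the peak output from the same panel area.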
A university professor from New Zealand is planning to put our modern knowledge of genetics to work in solving a decades-old mystery: does the Loch Ness Monster, nicknamed ‘Nessie’, actually exist?
The University of Otago’s Professor Neil Gemmell is proposing that new genomic forensic techniques be used to search for the elusive creature. While Nessie gained widespread popularity via the oft-debunked "surgeon’s photograph" published in 1934, legends of a large creature living in the lake predate the famous picture. Numerous sightings have been reported over the past century, along with the publication of dozens of photographs that allegedly depict Nessie.
One of the biggest barriers to the implementation of useful artificial intelligence in our culture is the limitation imposed by our computer hardware: modern computer chips have their circuits arranged in a two-dimensional layout, yet they run programs meant to mimic our own three-dimensional neurological processes. The 2D setup was, and still is, better suited to the linear processing that most computer programs require, but AI-based programs suffer a sizable drop in efficiency as a result. If AI is to grow into a valuable tool, a new form of computer hardware will be required to accommodate it.