Advances in Computers thread

Inside Project Loon: Google's internet in the sky is almost open for business

“Good news,” says Katelin Jabbari, Google X’s communications chief. “It’s about to explode.”

We’re several hundred feet in the air, inching our way along a wooden walkway tucked high into the rafters of the massive hangar at Moffett Federal Airfield, where Google’s most outlandish and secretive division has been testing new prototypes for Project Loon. Below us, a pair of huge balloons sway gently on their tethers. Engineers are racing around them like ants.

Loon is being built with the audacious goal of beaming internet access down to the most remote parts of the planet, using specially equipped balloons that kiss the upper edges of Earth’s atmosphere.
 
http://mashable.com/...arrier-service/

Following nearly a year of rumors, Google confirmed on Monday that it plans to offer talk and data plans to customers.

Sundar Pichai, senior VP at Google, said during a presentation at Mobile World Congress that the company is working on a wireless service on a "small scale."

Pichai outlined how the company aims to bring web connectivity to some of the 4 billion people currently without Internet access. Later this year, it will launch its first fleet of solar-powered drones into the sky as a part of its Project Titan program, he said.
 
Researchers develop the first-ever quantum device that detects and corrects its own errors
3 hours ago by Sonia Fernandez

A photograph of the nine-qubit device. The device consists of nine superconducting 'Xmon' transmon qubits in a row. Qubits interact with their nearest neighbors to detect and correct errors. Credit: Julian Kelly
When scientists develop a full quantum computer, the world of computing will undergo a revolution of sophistication, speed and energy efficiency that will make even our beefiest conventional machines seem like Stone Age clunkers by comparison.

But before that happens, quantum physicists like those in UC Santa Barbara physics professor John Martinis' lab will have to create circuitry that takes advantage of the marvelous computing prowess promised by the quantum bit ("qubit"), while compensating for its high vulnerability to environmentally induced error.

In what they are calling a major milestone, the researchers in the Martinis Lab have developed quantum circuitry that self-checks for errors and suppresses them, preserving the qubits' state(s) and imbuing the system with the highly sought-after reliability that will prove foundational for the building of large-scale superconducting quantum computers.
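The nearest-neighbor scheme described in the caption is the shape of a repetition code: measurement qubits sit between data qubits and report whether their neighbors agree, and a decoder uses those parity reports to outvote isolated errors. The article doesn't spell out the lab's actual protocol, so here is a minimal classical sketch of the idea, with the layout and decoding simplified by assumption:

```python
# Illustrative classical sketch of a bit-flip repetition code -- the kind of
# nearest-neighbor error detection described above. The five-bit codeword
# and majority-vote decoder are simplifications, not the lab's actual scheme.

def encode(bit, n=5):
    """Encode one logical bit as n redundant copies (the data 'qubits')."""
    return [bit] * n

def apply_errors(codeword, flips):
    """Flip the data bits at the given positions (simulated noise)."""
    return [b ^ 1 if i in flips else b for i, b in enumerate(codeword)]

def syndromes(codeword):
    """Parity of each nearest-neighbor pair -- what the interleaved
    measurement qubits would report. A 1 marks a disagreement."""
    return [codeword[i] ^ codeword[i + 1] for i in range(len(codeword) - 1)]

def decode(codeword):
    """Majority vote recovers the logical bit if fewer than half flipped."""
    return int(sum(codeword) > len(codeword) // 2)

cw = encode(0)
noisy = apply_errors(cw, flips={2})
print(syndromes(noisy))  # [0, 1, 1, 0] -- both neighbors of bit 2 disagree
print(decode(noisy))     # 0 -- the single error is outvoted
```

The point of the parity checks is that they locate errors without ever reading the data bits directly, which in the quantum case is what lets the circuit check itself without collapsing the qubits' states.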



Read more at: Researchers develop the first-ever quantum device that detects and corrects its own errors
 
iSkin stickers could be used to control mobile devices
By Ben Coxworth
March 4, 2015


While a wrist-worn smartwatch may be easier to access than a smartphone that has to be retrieved from a pocket, the things certainly have tiny screens. That could make them rather difficult to use for certain tasks, particularly ones where a larger interface area is needed. Well, that's where iSkin comes in. The experimental system allows users to control mobile devices using flexible, stretchable stickers that adhere to their skin.
 
Black phosphorus improves optical communication for chip interconnects


University of Minnesota researchers have found that an ultrathin black phosphorus film — only 20 layers of atoms — allows for high-speed data communication on nanoscale optical circuits. Black phosphorus is a crystalline form of the element phosphorus.
The devices showed vast improvement in efficiency over comparable devices using graphene.
The work by University of Minnesota Department of Electrical and Computer Engineering Professors Mo Li and Steven Koester and graduate students Nathan Youngblood and Che Chen was published Monday March 2 in Nature Photonics.
Chip-makers are attempting to cram more processor cores on a single chip, but getting all those processors to communicate with each other has been a key challenge for researchers. So the goal is to find materials that will allow high-speed, on-chip communication using light.
 
Glasses-free 3D display is made with tiny spherical lenses

One of the most common methods of creating the illusion of 3D is the autostereoscopic display, which is based on parallax: each eye is presented with a slightly different angle of a scene. Often this is done with many tiny microlenses, each projecting a small amount of light. Although this method has many advantages and is already being used in commercial products, such as the Nintendo 3DS, its narrow viewing angle is still a problem for expanding its use to larger displays.

In a new paper published in IEEE's Journal of Display Technology, researchers at Chengdu Technological University and Sichuan University, both in Chengdu, China, have addressed the narrow viewing angle problem by replacing the flat microlenses with microsphere lenses. They have built a prototype that demonstrates that the larger curvature of the spherical lenses increases the viewing angle from 20-30° to 32°, with a theoretical viewing angle of up to 90°.
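The geometry behind that improvement can be sketched roughly: each lens element's viewing cone is set by its aperture divided by the distance to the pixels beneath it, and a more strongly curved microsphere shortens that effective distance. A minimal Python sketch, with illustrative numbers of my own rather than the paper's device parameters:

```python
import math

# Rough paraxial sketch: for a microlens of pitch p focused on pixels at
# gap g behind it, the full viewing angle is about 2*atan(p / (2*g)).
# The pitch and gap values below are illustrative assumptions only.

def viewing_angle_deg(pitch_um, gap_um):
    """Full viewing angle (degrees) of one lens element."""
    return math.degrees(2 * math.atan(pitch_um / (2 * gap_um)))

# A more strongly curved lens focuses over a shorter gap, widening the cone.
print(round(viewing_angle_deg(100, 250), 1))  # ~22.6 deg (flat lenslet)
print(round(viewing_angle_deg(100, 170), 1))  # ~32.8 deg (microsphere)
```

The sketch only captures the trend — a shorter focal distance widens the viewing zone — not the paper's full optical model.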


Read more at: Glasses-free 3D display is made with tiny spherical lenses
 
Didn't link but saw articles where the new Apple Watch only has a 3-hour battery life!!!
 
Review: Eyefi's Mobi Pro card can wirelessly send your RAW photos
By Simon Crisp
March 12, 2015

Eyefi wireless SD cards have made it possible for cameras without built-in Wi-Fi to transfer images sans wires since 2007. However, the firm's recent Mobi cards forwent more advanced features in the name of simplicity and smartphone connectivity. The new Mobi Pro card promises to be just as easy to use, but also boasts higher-end features in a bid to appeal to enthusiast and professional photographers. We spent a bit of time with the Eyefi Mobi Pro ahead of its release to check it out.
 
IBM to demonstrate first on-package silicon photonics (ExtremeTech)

One of the most tantalizing next-generation technologies that could dramatically reduce system power consumption and improve bandwidth is silicon photonics. This method of chip-to-chip communication uses silicon as an optical medium, and transmits data incredibly quickly with far better power consumption and thermals than traditional copper wires. Now, IBM is claiming to have advanced the technology a significant step by integrating a silicon photonic chip on the same package as a CPU.

To date, silicon photonics has been a major research area for HPC and exascale computing, where the technology is seen as essential for the long-term progress of supercomputing.


As this chart shows, hitting one exaflop (a goal DARPA head Bob Colwell thinks is unlikely in any case) will require far more bandwidth and much higher efficiency. We need photonic links that can offer orders of magnitude more connectivity at 1 mW per gigabit per second of bandwidth, and at a cost of 2.5 cents per gigabit compared to $10 today.
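A back-of-envelope calculation shows why those targets matter. Using the article's figures (1 mW and 2.5 cents per Gbit/s of link bandwidth, versus roughly $10 today) — the 0.01 byte-per-flop interconnect ratio is my own assumption for illustration, not a figure from the article:

```python
# Back-of-envelope on exascale interconnect, using the article's per-Gbit/s
# power and cost targets. BYTES_PER_FLOP is an assumed ratio, not sourced.

FLOPS = 1e18                # one exaflop
BYTES_PER_FLOP = 0.01       # assumed interconnect bandwidth ratio

gbits = FLOPS * BYTES_PER_FLOP * 8 / 1e9   # total link bandwidth, Gbit/s

power_kw = gbits * 1.0 / 1e6      # at the 1 mW per Gbit/s target
cost_target = gbits * 0.025       # at 2.5 cents per Gbit/s
cost_today = gbits * 10.0         # at roughly $10 per Gbit/s today

print(f"{gbits:,.0f} Gbit/s of aggregate links")      # 80,000,000 Gbit/s
print(f"{power_kw:,.0f} kW of link power")            # 80 kW
print(f"${cost_target:,.0f} vs ${cost_today:,.0f}")   # $2,000,000 vs $800,000,000
```

Even under this modest bandwidth assumption, today's $10-per-gigabit links would cost hundreds of millions of dollars for the interconnect alone — a 400x gap from the target.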
 
China's Alibaba shows off pay-with-your-face technology at IT fair
1 hour ago by Romain Fonsegrives

The founder and executive chairman of Alibaba Group, Jack Ma, speaks during the official opening of the CeBIT technology fair in Hanover, Germany on March 15, 2015
China's Internet tycoon Jack Ma, founder of giant online merchant Alibaba, gave a glimpse of the future when he demonstrated a new e-payment system using facial recognition at the CeBIT IT fair in Germany.



Read more at: China's Alibaba shows off pay-with-your-face technology at IT fair
 
When IBM sold off its PC line to China, it unleashed inventiveness in ways never dreamed possible.

Recently one privately owned Chinese company was about to market an injectable computer but abandoned it when market research showed the limo liberals who could afford it would prefer one in suppository form. They were already accustomed to getting their information out of each other's asses.
 
Nvidia’s 2016 roadmap shows huge performance gains from upcoming Pascal architecture
At Nvidia’s keynote today to kick off GTC, CEO Jen-Hsun Huang spent most of his time discussing Nvidia’s various deep learning initiatives and pushing the idea of Tegra as integral to the self-driving car. He did, however, take time to introduce a new Titan X GPU — and to discuss the future of Nvidia’s roadmap.

When Nvidia’s next-generation GPU architecture arrives next year, codenamed Pascal, it’s going to pack a variety of performance improvements for scientific computing — though their impact on the gaming world is less clear.


Let’s start at the beginning:



Pascal is Nvidia’s follow-up to Maxwell, and the first desktop chip to use TSMC’s 16nmFF+ (FinFET+) process. This is the second-generation follow-up to TSMC’s first FinFET technology — the first generation is expected to be available this year, while FF+ won’t ship until sometime next year. This confirms that Nvidia chose to skip 20nm — something we predicted nearly three years ago.
 
Real-time holographic displays one step closer to reality
9 hours ago

Rendered schematic of holographic pixels in operation showing switching states. Credit: Calum Williams
Researchers from the University of Cambridge have designed a new type of pixel element and demonstrated its unique switching capability, which could make three-dimensional holographic displays possible.

Real-time dynamic holographic displays, long the realm of science fiction, could be one step closer to reality, after researchers from the University of Cambridge developed a new type of pixel element that enables far greater control over displays at the level of individual pixels. The results are published in the journal Physica Status Solidi.



Read more at: Real-time holographic displays one step closer to reality
 
Samsung can put 128GB of storage in your low-cost phone

by Jon Fingas | @jonfingas | 5hrs ago


Just because you're not splurging on a top-of-the-line smartphone doesn't mean that you have to settle for a tiny amount of storage. Samsung certainly thinks that way -- it just announced a 3-bits-per-cell flash memory chip that promises 128GB of storage in "mass market" (read: more affordable) mobile devices. It's based on the plain eMMC tech you see in most phones instead of the fast UFS format inside the Galaxy S6, but you probably won't complain about the speed when it can still read sequential data at a very respectable 260MB per second. The one catch? There's no word on when it'll be ready, so you may be waiting a while before you're carrying a budget phone with more drive space than some laptops.
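To put that 260 MB/s sequential read figure in perspective, some simple arithmetic on the article's numbers — how long a full read of the 128 GB chip would take:

```python
# Simple arithmetic on the quoted specs: time for one full sequential
# read of the 128 GB chip at 260 MB/s (decimal units throughout).

capacity_mb = 128 * 1000    # 128 GB
read_mb_s = 260             # quoted sequential read speed

seconds = capacity_mb / read_mb_s
print(f"~{seconds:.0f} s (about {seconds / 60:.1f} minutes)")  # ~492 s (about 8.2 minutes)
```

Reading the entire chip end to end in under ten minutes is respectable for budget eMMC — many hard drives of the era would take longer.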
 
