Advances in Computers thread

Having taught computer basics, I'm waiting for an advance in users. Any progress there?

"Rather than bringing me closer to others, the time that I spend online isolates me from the most important people in my life, my family, my friends, my neighbourhood, my community." Clifford Stoll

"There is no escaping from ourselves. The human dilemma is as it has always been, and we solve nothing fundamental by cloaking ourselves in technological glory." Neil Postman

"We have created an industrial order geared to automatism, where feeble-mindedness, native or acquired, is necessary for docile productivity in the factory; and where a pervasive neurosis is the final gift of the meaningless life that issues forth at the other end." Lewis Mumford
 
Industry's most powerful server graphics card, exceeding one TFLOPS of peak double precision performance, introduced
November 12, 2012

AMD today launched the AMD FirePro S10000, the industry's most powerful server graphics card, designed for high-performance computing (HPC) workloads and graphics-intensive applications. The AMD FirePro S10000 is the first professional-grade card to exceed one teraFLOPS (TFLOPS) of double-precision floating-point performance, helping to ensure optimal efficiency for HPC calculations. It is also the first ultra high-end card to deliver an unprecedented 5.91 TFLOPS of peak single-precision and 1.48 TFLOPS of double-precision floating-point performance. This ensures the fastest possible data-processing speeds for professionals working with large amounts of information. In addition to HPC, the FirePro S10000 is also ideal for virtual desktop infrastructure (VDI) and workstation graphics deployments.
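Peak figures like these are simple arithmetic: stream processors × clock × FLOPs per clock. A quick sanity check in Python, assuming the commonly quoted S10000 specs (3,584 total stream processors across both GPUs, an 825 MHz core clock, two single-precision FLOPs per clock, and double precision at one-quarter the single-precision rate):

```python
# Peak-FLOPS sanity check for the FirePro S10000.
# Spec assumptions: 3,584 stream processors, 825 MHz clock,
# 2 single-precision FLOPs per clock (a fused multiply-add
# counts as two operations), DP at 1/4 the SP rate.
stream_processors = 3584
clock_hz = 825e6
flops_per_clock = 2

peak_sp = stream_processors * clock_hz * flops_per_clock  # single precision
peak_dp = peak_sp / 4                                     # double precision

print(f"single precision: {peak_sp / 1e12:.2f} TFLOPS")  # 5.91
print(f"double precision: {peak_dp / 1e12:.2f} TFLOPS")  # 1.48
```

The result lines up with the article's 5.91 and 1.48 TFLOPS figures.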
 
Bringing 'Minority Report' touchless gestures to Windows 8



Elliptic Labs wants to bring the touchless gesture controls seen in the science-fiction film "Minority Report" to everyday consumer electronic devices, starting with Windows 8.

The company -- a Norwegian university spinout with offices in Oslo and Silicon Valley -- unveiled a set of tools to help consumer electronic companies enable touchless controls in their products. These would be similar to the kind of gesture controls seen with the Xbox 360 Kinect and in certain smart televisions like a few models from Samsung Electronics, but presumably would work more smoothly.

While the first step has been to integrate the controls with Windows 8 laptops and PCs, Kjolebakken said that he expects tablets and smartphones to eventually get the feature. Down the line, he sees the potential for cars to get gesture controls as well.

Bringing 'Minority Report' touchless gestures to Windows 8 | Cutting Edge - CNET News
 
This certainly pertains to computers.

Increasing efficiency of wireless networks
November 13, 2012 by Sean Nealon
Increasing efficiency of wireless networks

(Phys.org)—Two professors at the University of California, Riverside Bourns College of Engineering have developed a new method that doubles the efficiency of wireless networks and could have a large impact on the mobile Internet and wireless industries.

Efficiency of wireless networks is key because there is a limited amount of spectrum to transmit voice, text and Internet services, such as streaming video and music. And when spectrum does become available it can fetch billions of dollars at auction.

The "spectrum crunch" is accelerating as customers switch from traditional cell phones to smartphones and tablets. A tablet, for example, generates 121 times more traffic than a traditional cell phone.

Without making networks more efficient, customers are likely to drop more calls, pay more for service, endure slower data speeds, and never see an unlimited data plan again.

The UC Riverside findings were outlined in a paper titled "A method for broadband full-duplex MIMO radio" recently published online in the journal IEEE Signal Processing Letters. It was co-authored by Yingbo Hua and Ping Liang, who are both electrical engineering professors, and three of their graduate students: Yiming Ma, Ali Cagatay Cirik and Qian Gao.

Current radios for wireless communications are half-duplex, meaning signals are transmitted and received in two separate channels. Full duplex radios, which transmit signals at the same time in the same frequency band, can double the efficiency of the spectrum.

The UC Riverside researchers have developed a new solution called "time-domain transmit beamforming," which digitally creates a time-domain cancellation signal and couples it to the radio-frequency front end, allowing the radio to hear much weaker incoming signals while transmitting strong outgoing signals at the same frequency and the same time.

This solution is indispensable for a full-duplex radio in general and is complementary to other required components. It not only has a sound theoretical basis but also leads to lower-cost, faster, and more accurate channel estimation for robust and effective cancellation.
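The core difficulty of full duplex is that your own transmit signal is vastly stronger than the incoming one. The paper's time-domain transmit beamforming is more sophisticated, but the basic idea of digital self-interference cancellation (subtracting a scaled copy of the known transmit signal) can be sketched in a few lines; the signal levels and single-tap channel model here are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
tx = rng.standard_normal(n)           # our own strong outgoing signal (known to us)
weak = 1e-3 * rng.standard_normal(n)  # faint incoming signal we want to hear
h = 0.8                               # self-interference channel gain (unknown)
rx = h * tx + weak                    # what the antenna actually receives

# Estimate the self-interference channel by least squares, then cancel it.
h_hat = tx @ rx / (tx @ tx)
residual = rx - h_hat * tx            # mostly the weak incoming signal remains

print(f"interference suppressed by roughly {np.std(rx) / np.std(residual):.0f}x")
```

After cancellation, the residual is dominated by the weak incoming signal, which is exactly the condition a full-duplex receiver needs.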
 
New 3D gesture system could change how we use our phones


New 3D gesture system could change how we use our phones | DVICE

The Kinect has already shown us the beginnings of a new world of console gaming interfaces controlled by gestures, but a new development promises to bring that dynamic to tiny mobile devices using the world's first electrical field-based 3D gesture controller.

Developed by Microchip Technology, the GestIC system is designed to allow users to control their mobile devices with simple hand gestures. Instead of using a camera to track your hand movements, the controller maps them using electrical fields. The dynamic appears to be relatively precise, allowing users to swipe to turn pages, engage winding functions with a circular motion, and put the device to sleep with a simple hand wave.

Because it uses electrical fields rather than cameras, the controller's range is limited to about six inches, but its power usage is more efficient, making it a potentially better fit for mobile applications. Microchip expects the controller to appear in a wide variety of commercially available mobile devices in 2013.

You can see a demonstration of how the controller works in the video below.
Microchip's Revised MGC3130 Demonstration - YouTube: http://www.youtube.com/watch?v=Eyw6t85Ub6Y
 
New WiFi protocol boosts congested wireless network throughput by 700%
By Sebastian Anthony on November 14, 2012 at 2:08 pm
New WiFi protocol boosts congested wireless network throughput by 700% | ExtremeTech

Engineers at NC State University (NCSU) have discovered a way of boosting the throughput of busy WiFi networks by up to 700%. Perhaps most importantly, the breakthrough is purely software-based, meaning it could be rolled out to existing WiFi networks relatively easily — instantly improving the throughput and latency of the network.

As wireless networking becomes ever more prevalent, you may have noticed that your home network is much faster than the WiFi network at the airport or a busy conference center. The primary reason for this is that a WiFi access point, along with every device connected to it, operates on the same wireless channel. A channel is basically a single-lane road, a lot like an electrical (copper wire) bus. Each channel, depending on the wireless technology being used, has a maximum bandwidth (say, 100 megabits per second), with that bandwidth being distributed between all connected devices.

At home, you might have exclusive use of that road, meaning you can drive as fast as you like and suck up every last megabit — but at a busy conference center, you are fighting tens or hundreds of people for space. In such a situation, your bandwidth allocation rapidly dwindles and your latency quickly climbs. This single-channel problem is also compounded by the fact that the road isn’t just one-way; the access point also needs to send data back to every connected device.
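The arithmetic of the shared channel is blunt. A sketch of the ideal per-user share, using the article's illustrative 100 Mbps channel:

```python
def per_user_mbps(channel_mbps: float, users: int) -> float:
    """Ideal fair share of a single wireless channel, ignoring
    protocol overhead and contention (which make reality worse)."""
    return channel_mbps / users

for users in (1, 10, 100):
    print(f"{users:>3} users: {per_user_mbps(100, users):g} Mbps each")
```

One user at home gets the whole road; a hundred users at a conference center get 1 Mbps each before overhead even enters the picture.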

Hopefully this will speed up large "hotspots" to the point where there's no difference between home and an airport.


Chemists create self-assembling polymer that increases hard drive capacity by 5X
By James Plafke on November 14, 2012 at 10:25 am
http://www.extremetech.com/computin...ymer-that-increases-hard-drive-capacity-by-5x


When you think of increased storage capacity, you most likely don’t think of self-assembling polymers that only require heat in order to rearrange themselves. However, with the potential to increase HDD storage capacity fivefold, researchers at the University of Texas might make self-assembling polymers the norm.

Currently, information is stored by printing zeroes and ones as magnetic dots on a metal surface, with the amount of information that can be stored depending on the spacing of the dots. The closer the dots, the more information can be stored. With current technology, the dots have become so close together that any further decrease in spacing would cause instability due to the neighboring dots' magnetic fields. However, if there were a way to protect the dots from neighboring magnetic fields, they could be moved even closer together, creating more storage space.

University of Texas chemists and engineers teamed up to apply a coat of a substance known as a block copolymer — a grouping of polymers made out of more than one bondable molecule — to a metal surface. If delicately coaxed, such as with a bit of heat, the block copolymers reorganize themselves into a regular pattern, and if the surface contains some kind of guide, they follow it. It just so happens that the magnetic dots on a hard drive provide the perfect guide for the block copolymers, and the copolymers provide just enough shielding from magnetic fields to let the dots be packed much closer than the normal spacing, without risk of data corruption.
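The payoff of tighter packing is quadratic, since bits are laid out in two dimensions: a fivefold capacity gain only requires shrinking the dot pitch by a factor of √5 ≈ 2.24. A back-of-envelope check, assuming capacity scales with the inverse square of dot spacing:

```python
import math

def capacity_gain(pitch_shrink_factor: float) -> float:
    """Areal density scales with 1/pitch^2 when dots move closer in
    both dimensions, so halving the pitch quadruples the capacity."""
    return pitch_shrink_factor ** 2

# A 5x capacity gain needs the pitch reduced by sqrt(5) ~ 2.24x.
print(f"{capacity_gain(math.sqrt(5)):.1f}x")  # 5.0x
```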
 
Since the University students returned, my DSL bogs down big time at night. Called Mediacommie and he said "yup". Called another local provider that runs all that shit on a dedicated phone line.
I was told I won't be bogged down at night. Does that sound kosher?
 
First teleportation between macroscopic objects leads the way to a quantum internet
By James Plafke on November 15, 2012 at 1:53 pm
First teleportation between macroscopic objects leads the way to a quantum internet | ExtremeTech

The long-range teleportation barrier has already been broken multiple times, but the information being teleported, such as a single quantum bit (qubit), has always been relatively small, usually passing between two photons. This time around, a team of physicists has managed to teleport information from one macroscopic (visible to the naked eye) object to another for the first time, potentially leading us toward the first quantum network routers.

Qubits — the basis for quantum networking and computing — are highly unstable, and are destroyed by a single measurement. However, physicists have figured out how to send a qubit without destroying it through the use of teleportation, managing to send them over large distances in the past — once over a distance of 60 miles, and another over a distance of 89 miles. Essentially, two quantum objects are linked together, so a measurement of one affects the other, which has allowed physicists to teleport a qubit without it actually moving through the space between two locations.
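The protocol itself can be simulated classically with a little linear algebra. Below is a toy numpy simulation of textbook single-qubit teleportation (the standard protocol, not the rubidium-ensemble experiment): the sender Bell-measures her two qubits, and the receiver restores the state by applying a correction chosen by the two classical measurement bits.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary qubit state to teleport: a|0> + b|1>
a, b = 0.6, 0.8
psi = np.array([a, b], dtype=complex)

# Shared Bell pair (|00> + |11>)/sqrt(2): qubit 1 = sender, qubit 2 = receiver
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)  # full 3-qubit state; qubit 0 holds psi

# Sender's Bell measurement = CNOT(0 -> 1), then H on qubit 0, then read qubits 0,1
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)
state = np.kron(CNOT, I2) @ state
state = np.kron(np.kron(H, I2), I2) @ state

# Sample a measurement outcome for qubits 0,1 according to the Born rule
amps = state.reshape(4, 2)                 # rows indexed by (m0, m1)
probs = (np.abs(amps) ** 2).sum(axis=1)
outcome = rng.choice(4, p=probs)
m0, m1 = outcome >> 1, outcome & 1
receiver = amps[outcome] / np.linalg.norm(amps[outcome])

# Receiver fixes up his qubit with X^m1 then Z^m0, using the two classical bits
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
if m1:
    receiver = X @ receiver
if m0:
    receiver = Z @ receiver

# The receiver's qubit now matches psi, though psi never crossed the gap directly
fidelity = abs(np.vdot(psi, receiver))
print(f"fidelity with the original state: {fidelity:.6f}")
```

Each of the four measurement outcomes occurs with probability 1/4, and in every case the correction restores the original amplitudes a and b, which is why the measurement destroys nothing.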

However, up until now, each successful teleportation has been between either two microscopic objects, or one microscopic and one macroscopic object. Now, Xiao-Hui Bao and a team at the University of Science and Technology of China report that they have managed to teleport information between two macroscopic objects — two groups of rubidium atoms — over a distance of 150 meters. Though it wasn't over a distance of 60 or 89 miles, this was the first time quantum information has been teleported between two macroscopic objects, and at a macroscopic scale.
 
World's first stream aggregation technology to rapidly process both historical and incoming data


Characteristics of existing technologies and the new technology.

Fujitsu Laboratories announced development of the world's first stream aggregation technology able to rapidly process both stored historical data and incoming streams of new data in a big data context.

The nature of big data requires that enormous volumes of data be processed at high speed. When data is aggregated, longer aggregation times mean larger data volumes to process; computation times lengthen, which makes frequent updating difficult. This is why combining frequent updates with long aggregation times has so far been challenging. Fujitsu Laboratories has therefore developed a technology that returns computation results quickly and manages snapshot operations without re-doing computations or re-reading the variety of data types that change over time. As a result, even with high-frequency updating and long aggregation times, data can be processed 100 times faster than before. The technology promises to improve both large-volume batch processing and the processing of streaming data. In meteorology, for example, it makes it possible to track concentrated downpours in specific areas; beyond weather forecasting, it may also find uses in new fields that demand real-time processing of longitudinal data.
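Fujitsu has not published its implementation, but the general trick behind fast stream aggregation is incremental maintenance: update a running result as values arrive and expire, instead of re-reading the whole window on every update. A minimal sketch of that idea (the class name and fixed-size window model are illustrative assumptions):

```python
from collections import deque

class WindowedSum:
    """Maintain the sum over the last `size` values incrementally:
    each new value costs O(1) instead of re-scanning the window."""

    def __init__(self, size: int):
        self.size = size
        self.window = deque()
        self.total = 0

    def update(self, value):
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()  # expire the oldest value
        return self.total

ws = WindowedSum(size=3)
print([ws.update(x) for x in [1, 2, 3, 4, 5]])  # [1, 3, 6, 9, 12]
```

The same pattern generalizes to other aggregates and is why long aggregation windows need not mean long recomputation times.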

Read more at: World's first stream aggregation technology to rapidly process both historical and incoming data
 
WD introduces Black 4TB desktop drive

by Geoff Gasior — 7:00 AM on November 20, 2012
WD introduces Black 4TB desktop drive - The Tech Report

Western Digital hasn't bumped up the capacity of its high-end Black mechanical hard drive since the 2TB model debuted over three years ago. That older drive was eventually replaced by an upgraded variant with a 6Gbps SATA interface, but the total capacity remained unchanged. Today, there's a new addition with twice the storage: the Black 4TB.

Aside from doubling the capacity of its predecessor, the new Black has much in common with existing members of the family. The spindle speed is 7,200 RPM, the interface is 6Gbps SATA, and the DRAM cache is 64MB. The Black 4TB also features a dual-stage actuator whose design originated in the Black 2TB. This mechanism uses a second arm to move the drive head with additional precision, helping to keep
 
Quantum cryptography done on standard broadband fibre

BBC News - Quantum cryptography done on standard broadband fibre

The "uncrackable codes" made by exploiting the branch of physics called quantum mechanics have been sent down kilometres of standard broadband fibre.

This "quantum key distribution" has until now needed a dedicated fibre separate from that used to carry data.

But a new technique reported in Physical Review X shows how to unpick normal data streams from the much fainter, more delicate quantum signal.

It may bring this best-available encryption into many businesses and even homes.

The quantum key distribution or QKD idea is based on the sharing of a key between two parties - a small string of data that can be used as the basis for encoding much larger amounts.
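In practice, the short shared key seeds a symmetric cipher that encrypts the bulk traffic. A toy sketch of that key-expansion idea, using SHA-256 in counter mode as a stand-in keystream (a real deployment would use a vetted authenticated cipher such as AES-GCM):

```python
import hashlib

def keystream(shared_key: bytes):
    """Expand a short shared secret into an arbitrarily long keystream
    by hashing the key with a counter (a stand-in for a real stream cipher)."""
    counter = 0
    while True:
        yield from hashlib.sha256(shared_key + counter.to_bytes(8, "big")).digest()
        counter += 1

def xor_crypt(data: bytes, shared_key: bytes) -> bytes:
    """Encryption and decryption are the same XOR operation."""
    return bytes(d ^ k for d, k in zip(data, keystream(shared_key)))

key = b"short-quantum-negotiated-key"
message = b"a much larger amount of data than the key itself..."
ciphertext = xor_crypt(message, key)
assert xor_crypt(ciphertext, key) == message  # decrypting round-trips
```

The security of the whole scheme then rests on the key exchange, which is exactly the step QKD protects.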
 
America’s Titan Surpasses Sequoia as World’s Fastest Supercomputer
America’s Titan Surpasses Sequoia as World’s Fastest Supercomputer | Singularity Hub

Last week, a throng of computer geeks descended on snowy Utah to show off, admire, and debate the future of the fastest computers on the planet. And of course, to find out which Boolean monster rules the roost. For the second time in 2012, a different supercomputer took top honors: Titan.

Titan is a Cray XK7 residing at Oak Ridge National Laboratory (ORNL). According to the November Top500 list of the most powerful supercomputers, the system notched 17.59 petaFLOP/s (floating-point operations per second) as measured by the Linpack benchmark. The previous mark of 16.32 petaFLOP/s was held by Lawrence Livermore National Laboratory's BlueGene/Q system, Sequoia.

While Titan is a new name, it is not an entirely new computer.

The system is a souped-up version of ORNL's previous Top500 champ, Jaguar (November 2009 to June 2010). Although Titan occupies the same footprint and consumes about the same power as Jaguar, it is almost ten times faster, a mark that also earned Titan third place on the Green500 list of the most power-efficient machines.

And that’s really a key point. Measuring and comparing speed is exciting, but the future of supercomputing depends on efficiency gains too.

In the last decade or so, engineers have increased power by building massively parallel systems. That is, engineers have been linking more and more processors and stuffing them into tighter and tighter spaces.

Indeed, the previous record holder, Sequoia, packs over 1.6 million processing cores. And to get from Jaguar to Titan, engineers increased the number of processing cores per node from 12 to 16—for a total of over 560,000.

But simply increasing the number of cores isn’t scalable long term—practically or economically. According to ORNL computational scientist Jim Hack, “We have ridden the current architecture about as far as we can.”

Jaguar was powered by 300,000 processors in 200 cabinets; upping performance tenfold the conventional way (as Titan has done) would have required "2,000 cabinets on the floor, consume 10 times the power, et cetera."

Titan is ten times speedier—but is also about the same size and requires the same amount of power as Jaguar. The system’s superior efficiency is thanks in part to an improved Cray system interconnect, upping communication volume and efficiency between processors. But Titan makes use of another increasingly popular strategy.

Instead of relying on traditional CPUs to do all the heavy lifting and higher level decision-making and communication, engineers are taking a page from high performance gaming. Installed in each of Titan’s 16-core nodes is an NVIDIA K20X Graphics Processing Unit (GPU).

The GPU co-processor serves as the system workhorse—not terribly bright, but immensely powerful.

GPUs handle all the really heavy computational work superfast, leaving the CPUs to direct traffic. This specialization realizes some pretty awesome efficiency gains without sacrificing speed to get there.


NVIDIA Tesla K20X GPU co-processor.

Beyond Titan, China’s Tianhe-1A—Top500 champ in 2010, now #8 on the list—uses GPU co-processors. In fact, 62 of the top 500 supercomputers use co-processors, up from 58 systems in June.

Chipmakers are taking note too.

Three big players revealed ultra high performance GPUs in November—NVIDIA’s Tesla K20X (utilized in Titan), Intel’s Xeon Phi (Knight’s Corner), and AMD’s FirePro S10000.

Supercomputers aren’t just getting speedier. They’re getting more speed per unit of power and space too. Maybe in the not too terribly distant future, we’ll remember today’s giant, power-hungry machines with an incredulous shake of the head, like those old room-sized Crays whose performance now fits into a laptop.
 
Fast forward to the past: Technologists test 'game-changing' data-processing technology

Analog-Based Microchip

The new technology is an analog-based microchip developed with significant support from the Defense Advanced Research Projects Agency (DARPA). Instead of relying on tiny switches or transistors that turn on and off, producing streams of ones and zeroes that computing systems then translate into something meaningful to users, the company's new microchip is more like a dimmer switch. It can accept inputs and calculate outputs that are between zero and one, directly representing probabilities, or levels of certainty.

"The technology is fundamentally different from standard digital-signal processing, recognizing values between zero and one to accomplish what would otherwise be cost prohibitive or impossible with traditional digital circuits," Pellish said.

The processor's enhanced performance is due to the way the technology works, he explained. While digital systems use processors that step through calculations one at a time, in a serial fashion, the new processor uses electronic signals to represent probabilities rather than binary ones and zeros. It then effectively runs the calculations in parallel. Where it might take 500 transistors for a digital computer to calculate a probability, the new technology would take just a few. In other words, the microchip can perform a calculation more efficiently, with fewer circuits and less power than a digital processor—attributes important for space- and power-constrained spacecraft instruments, Pellish said.
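Representing levels of certainty directly makes probabilistic arithmetic native. Here is a sketch of the kind of operation such a chip can evaluate in one analog step, a single Bayesian update over values between zero and one; the function and the numbers are illustrative, not the chip's actual instruction set:

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """One Bayesian update: combine a prior level of certainty with the
    likelihood of the observed evidence, all values between 0 and 1."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# A 50/50 prior plus evidence that is 9x likelier if the hypothesis is true:
print(bayes_update(0.5, 0.9, 0.1))  # 0.9
```

A digital processor would step through the multiplications and the division serially through many transistors; an analog probability processor evaluates continuous signal levels like these directly.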

Fast forward to the past: Technologists test 'game-changing' data-processing technology


------
New computerized approach could revolutionize design and manufacturing
November 27, 2012

http://phys.org/news/2012-11-computerized-approach-revolutionize.html

Engineers at Oregon State University and other leading institutions have made important advances that may dramatically change how machines get built, with a concept that could turn the approaches used by modern industry into a historic relic.


They will essentially throw out the old "design it, build a prototype and test it, then fix the mistakes and test it some more" method that's been in place since the dawn of the Industrial Revolution. Approaches that worked for Robert Fulton or Henry Ford are now considered too expensive, wasteful, unpredictable and time-consuming.

Instead, virtually all of the design, testing, error identification and revisions will be done on a computer up to the point of commercial production. In theory, a new machine should work right the first time, and perform exactly as the computer said it would.

"If this works, and we believe it will, then it will revolutionize the way that machines get built," said Irem Tumer, an associate professor in OSU's School of Mechanical, Industrial and Manufacturing Engineering.

"The field holds great promise to design and test completed machines on a computer before they are ever built," she said. "We'll see what works, identify and solve problems, make any changes desired, and then go straight to commercial production."

The concept is called "model based design and verification," and is getting initial impetus from a design challenge sponsored by the U.S. military, which wants a new amphibious vehicle in about one-fifth of the time it would ordinarily take to build it. They also want lower cost and excellent performance.

"You can understand why our armed forces are interested in this," she said. "They want to speed production of needed military vehicles by five times over the conventional approach, which is a pretty aggressive goal. For them, it's about saving money, saving time, and ultimately producing technology that helps to save lives."

After that, Tumer said, the systems could be used anywhere. There's little downside to producing cars, aircraft, or new industrial machines that work right the first time, cost less and get produced more quickly.


---

GM cars to use Apple voice assistant Siri
November 27, 2012
http://phys.org/news/2012-11-gm-cars-apple-voice-siri.html

General Motors said Tuesday it will integrate Apple's voice-activated software Siri in some of its cars next year to allow iPhone users to perform hands-free tasks.

GM announced at the Los Angeles International Auto Show it will use the Siri intelligent assistant in the Chevrolet Spark and Sonic LTZ and RS.

"Through the cars' standard Chevrolet MyLink infotainment system, customers with a compatible iPhone running iOS 6 can direct Siri to perform a number of tasks while they safely keep their eyes on the road and their hands on the wheel," GM said in a statement.
 
Tiny USB packs 1TB of storage in Swiss Army Knife body



Tiny USB packs 1TB of storage in Swiss Army Knife body
It's common to see USB thumb drives in relatively large capacities these days, with 8GB, 16GB, and 32GB sizes fairly de rigueur. But what to do if your portable storage needs are much, much larger?


Victorinox, the company best known for its popular line of convenient multi-bladed Swiss Army Knives, has a solution for the storage-hungry among you. Its new Victorinox SSD product delivers an amazing 1TB of storage in a container the size of your average USB thumb drive or, more literally, the size of a Swiss Army Knife.

The technology inside is actually a Solid State Drive — the same type of fast and light drive found in ultra-slim notebooks like the MacBook Air. Only this one is much smaller than your typical SSD — tiny enough to actually fit inside a pocketknife. The drive also comes loaded with a combination of both hardware and software encryption, making all that valuable data highly secure and safe from prying eyes.
 
Taiwan engineers defeat limits of flash memory
December 2, 2012 by Nancy Owano

Taiwan engineers defeat limits of flash memory

(Phys.org)—Taiwan-based Macronix has found a solution for a key weakness of flash memory: the more its cells are erased, the less reliably they store data. Write-erase cycles degrade the insulation, and eventually a cell fails. "Flash wears out after being programmed and erased about 10,000 times," notes IEEE Spectrum. Engineers at Macronix have a fix that gives flash memory a new lease on life: a "self-healing" NAND flash design that can survive over 100 million cycles.



News of the findings appears in IEEE Spectrum, which discusses flash memory's limitations and the Taiwan company's solution. Macronix is a manufacturer in the non-volatile memory (NVM) market, with NOR flash, NAND flash, and ROM products. Even before this announcement, many engineers inside and outside Macronix knew of a life-extending workaround: heat. The snag is that applying heat had not been practical; as the Macronix team put it, the "long baking time is impractical for real time operation." Although subjecting the cells to high heat could restore the memory, the entire chip would need to be heated for hours at around 250 °C.


They redesigned a flash memory chip to include onboard heaters that anneal small groups of memory cells. Applying a brief jolt of heat (around 800 °C) to a very restricted area within the chip returns a worn cell to a "good" state, and the process does not have to be run often. According to project member Hang‑Ting Lue, the annealing can be done infrequently and on one sector at a time while the device is inactive but still connected to a power source, so it would not drain a cellphone battery.


Macronix estimates that the flash memory cells could beat the 10,000-cycle limit by lasting for as many as 100 million cycles, but a commercial product is not imminent. Instead, Macronix will present the approach—very high temperature for a very short time—this month at the IEEE International Electron Devices Meeting (IEDM), December 10 to 12 in San Francisco, the forum for presenting breakthroughs in semiconductor and electronic device technology. Lue observed that in coming up with the approach, his team could not lay claim to any new physics principle. "We could have done this ten years ago," he said; it took merely a leap of imagination into a different "regime."
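The endurance claim can be pictured with a toy wear model: each program/erase cycle accumulates damage in the cell's insulation, and a periodic heat pulse resets it. The numbers and the linear-damage model below are illustrative assumptions only:

```python
def cycles_survived(endurance: int = 10_000, anneal_every: int = 0,
                    max_cycles: int = 1_000_000) -> int:
    """Toy model: a cell fails once accumulated damage reaches `endurance`;
    an anneal (the on-chip heat pulse) resets the damage to zero."""
    damage = 0
    for cycle in range(1, max_cycles + 1):
        damage += 1
        if anneal_every and cycle % anneal_every == 0:
            damage = 0  # heat pulse repairs the degraded insulation
        if damage >= endurance:
            return cycle
    return max_cycles  # still alive when the simulation stops

print(cycles_survived())                   # 10000: wears out without annealing
print(cycles_survived(anneal_every=5000))  # 1000000: annealing keeps it alive
```

In this simple model, any anneal interval shorter than the cell's endurance keeps the cell alive indefinitely, which is the shape of Macronix's claim.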
 
Technology makes smartphone screens feel like keyboards
Technology makes smartphone screens feel like keyboards | DVICE

Kyocera has created a touchscreen that makes pressing an on-screen button feel like pushing a physical one. You can even choose between the feel of a hard button and a soft-touch keyboard.

It works by using piezoelectricity (i.e. electricity resulting from pressure) to create an extremely high-speed, intense little vibration where you touch the keyboard. Thus, it feels like you've actually touched something.

While this might not seem like a big step, it can actually be pretty helpful: typing accuracy will go nowhere but up, and it'll probably open up some pretty neat gaming opportunities.

Cool, this will help tablets gain more of the laptop market.




Kyocera demonstrates new touchscreen that feels like it has physical keys

November 8, 2012 By Andy Boxall
http://www.digitaltrends.com/mobile/kyocera-demonstrates-new-tactile-touchscreen/


Kyocera has demonstrated a touchscreen that, it claims, feels like touching a physical keyboard. Using technology similar to haptic feedback, it can be adjusted to provide a different feel depending on what you're touching on screen.

Japanese technology firm Kyocera has come up with a new touchscreen, which it demonstrated at the Digital Contents Expo in Tokyo last week. It uses similar technology to that which gives us haptic feedback, and when described, may remind you of Research in Motion’s SurePress technology seen on its Storm phones.
 
Research could lead to more efficient integrated circuits

5 December 2012

Engineers in the US have fabricated transistors with 20nm gates — a development that could lead to faster, more compact and more efficient integrated circuits.

‘It’s a preview of things to come in the semiconductor industry,’ said Peide ‘Peter’ Ye, a professor of electrical and computer engineering at Purdue University.

Researchers from Purdue and Harvard universities created the transistors with indium-gallium-arsenide, a material that could replace silicon within a decade. Each transistor contains three tiny nanowires made from the material that are progressively smaller, yielding a tapered cross section that resembles a Christmas tree.

Read more: Research could lead to more efficient integrated circuits | News | The Engineer
 
DNP unveils new plastic cover sheet for handheld devices

(Phys.org)—Japanese printing company Dai Nippon Printing (DNP) has unveiled a new type of plastic cover sheet that may soon replace glass screen covers on smartphones and tablet computers. The new material is reportedly as hard as Gorilla Glass and extremely resistant to scratching. And because it's made of plastic, it can also be bent.

The sheet is actually a sandwich of three components: a fingerprint-proof layer on the front, a plastic substrate, and a hardening-agent coating on the back. The result is a sheet of plastic just 0.5mm thick that can be manufactured in virtually any size. DNP says testing has shown the cover to have a pencil hardness of 9H, comparable to the Gorilla Glass now commonly used to cover the screens of most handheld electronic devices. The company also brushed samples 200 times with steel wool under a load of 500g/cm2 and found no scratching.

The company says the new covers would be a superior alternative to glass because they are far less susceptible to scratching and breaking. The team also tested bendability with a mandrel test on 1.0mm- and 0.5mm-thick sheets, which survived bending around diameters of 140mm and 90mm, respectively. The new covers would thus be suitable for bendable smartphones and tablet computers.

Read more at: DNP unveils new plastic cover sheet for handheld devices
 
Breakthrough in augmented reality contact lens: Curved LCD display holds widespread potential

The contact lens display with the dollar sign held next to a human eye.

The Centre of Microsystems Technology (CMST), Imec's associated laboratory at Ghent University (Belgium), announced today it has developed an innovative spherical curved LCD display which can be embedded in contact lenses. The first step toward fully pixelated contact lens displays, this achievement has potential widespread applications in medical and cosmetic domains.

Unlike LED-based contact lens displays, which are limited to a few small pixels, imec's innovative LCD-based technology permits the use of the entire display surface. By adapting the patterning process of the conductive layer, the technology enables applications with a broad range of pixel numbers and sizes, such as a one-pixel, fully covered contact lens acting as adaptable sunglasses, or a highly pixelated contact lens display.

Read more at: Breakthrough in augmented reality contact lens: Curved LCD display holds widespread potential
 
Mushkin unveils world’s first 480GB mSATA solid state drive
Brittany Hillen, Dec 6th 2012

Mushkin has unveiled what it claims to be the world’s first 480GB mSATA SSD. The device is part of the Atlas line, and joins Mushkin’s large catalog of SSD offerings, which includes the Callisto, Catalyst, and Chronos lines. The 480GB SSD is slated for release in January for $499.99, which prices it at a little over $1 per gigabyte.

Keeping z-height as low as possible while fitting eight NAND flash chips and a controller on an mSATA PCB was no easy feat, but capacity-hungry Ultrabook and notebook users can now go beyond the 256GB mSATA barrier.
 
