Don't Let Anybody Tell You That Businesses Create Jobs

Incidentally, onepercenter, if spending creates jobs, whatever happened to the United States' dominant steel industry? This country was the world's biggest steel producer in the 1940s. Where did all those jobs go? Did the demand for steel for our bridges and skyscrapers suddenly stop?

Sorry, your belief that jobs are solely dependent upon spending is incorrect.
 
Right, money taken from someone else.

I use tax-supported services. I believe in paying for what I use. I take it you don't feel that way.

Well, the tax-supported school you attended....didn't do so well.

That could be why we so desperately need a 'no child left behind' program: to rate schools' quality of education and to aid those who need that extra help to graduate.
 
You posted:

Corporate America has gotten away with mass murder.

Employees make all the money for an employer. How much money should they make?

And:

The employer provides the platform. The employee does all of the work. The employer collects the monies.

"Employees make all the money for an employer."

This implies that the employer makes none of the money for themselves.

You then go on to refute that idea with: "The employer provides the platform."

In other words, the employer provides everything necessary to make the product except the labor itself: tools, tech, building, raw materials.

"The employer collects the monies."

Implying that the employee collects none.

Which is a lie.

The employee collects compensation for his labor, compensation that he agreed to beforehand. He is paid for what he puts into the product according to the terms of his agreement with his employer.

What you are trying to say, though, is that in addition to the pay he already receives for his time and effort, he should be paid a share of the profits from the sale of something he never owned and could not have made without the employer's investment.

Is that it?
 
Steve Jobs didn't develop the iPad, his employees did.

The LED and computer microchip were developed by NASA.

Wrong, they were developed by Bell Labs and Texas Instruments.

WTF, do either of you know how Google works??

[Excerpt]
Electroluminescence as a phenomenon was discovered in 1907 by the British experimenter H. J. Round of Marconi Labs, using a crystal of silicon carbide and a cat's-whisker detector.[11][12] Russian Oleg Losev reported creation of the first LED in 1927.[13] His research was distributed in Russian, German and British scientific journals, but no practical use was made of the discovery for several decades.[14][15] Rubin Braunstein[16] of the Radio Corporation of America reported on infrared emission from gallium arsenide (GaAs) and other semiconductor alloys in 1955.[17] Braunstein observed infrared emission generated by simple diode structures using gallium antimonide (GaSb), GaAs, indium phosphide (InP), and silicon-germanium (SiGe) alloys at room temperature and at 77 kelvins.

In 1957, Braunstein further demonstrated that the rudimentary devices could be used for non-radio communication across a short distance. As noted by Kroemer,[18] Braunstein "... had set up a simple optical communications link: Music emerging from a record player was used via suitable electronics to modulate the forward current of a GaAs diode. The emitted light was detected by a PbS diode some distance away. This signal was fed into an audio amplifier, and played back by a loudspeaker. Intercepting the beam stopped the music. We had a great deal of fun playing with this setup." This setup presaged the use of LEDs for optical communication applications.


Diagram of the tunnel diode constructed on a zinc diffused area of gallium arsenide semi-insulating substrate
In the fall of 1961, while working at Texas Instruments Inc. in Dallas, TX, James R. Biard and Gary Pittman found that gallium arsenide (GaAs) emitted infrared light when electric current was applied. On August 8, 1962, Biard and Pittman filed a patent titled "Semiconductor Radiant Diode" based on their findings, which described a zinc diffused p–n junction LED with a spaced cathode contact to allow for efficient emission of infrared light under forward bias.
[/Excerpt]


RCA would be before both of your times ...............................
TI ring any bells??

[Excerpt]
History
The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control over myriad objects from appliances to automobiles to cellular phones and industrial process control.

The first use of the term "microprocessor" is attributed to Viatron Computer Systems describing the custom integrated circuit used in their System 21 small computer system announced in 1968.

Intel introduced its first 4-bit microprocessor 4004 in 1971 and its 8-bit microprocessor 8008 in 1972. During the 1960s, computer processors were constructed out of small and medium-scale ICs—each containing from tens of transistors to a few hundred. These were placed and soldered onto printed circuit boards, and often multiple boards were interconnected in a chassis. The large number of discrete logic gates used more electrical power—and therefore produced more heat—than a more integrated design with fewer ICs. The distance that signals had to travel between ICs on the boards limited a computer's operating speed.

In the NASA Apollo space missions to the moon in the 1960s and 1970s, all onboard computations for primary guidance, navigation and control were provided by a small custom processor called "The Apollo Guidance Computer". It used wire wrap circuit boards whose only logic elements were three-input NOR gates.[8]

The first microprocessors emerged in the early 1970s and were used for electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit words. Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, various kinds of automation etc., followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on.

Since the early 1970s, the increase in capacity of microprocessors has followed Moore's law; this originally suggested that the number of components that can be fitted onto a chip doubles every year. With present technology, it is actually every two years,[9] and as such Moore later changed the period to two years.[10]
[/Excerpt]
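The doubling described in the excerpt above compounds fast either way; here's a quick numeric sketch of Moore's original one-year period versus the revised two-year period. The starting count and years are illustrative assumptions, not historical data.

```python
# Sketch of exponential component growth under a fixed doubling period.
# Starting count (2,300) and years are illustrative assumptions only,
# roughly the scale of the first 4-bit microprocessors mentioned above.

def component_count(start_count, start_year, year, doubling_period_years):
    """Project a component count forward assuming a fixed doubling period."""
    periods = (year - start_year) / doubling_period_years
    return start_count * 2 ** periods

# Ten years out, a one-year doubling period vs. a two-year period diverge
# by a factor of 32 (10 doublings vs. 5 doublings):
yearly = component_count(2_300, 1971, 1981, 1)    # 2,300 * 2**10 = 2,355,200
biyearly = component_count(2_300, 1971, 1981, 2)  # 2,300 * 2**5  = 73,600

print(int(yearly), int(biyearly))
```

The point of the comparison: which doubling period you assume changes the projection by orders of magnitude within a decade, which is why Moore's revision mattered.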

Never heard of Viatron either, right??

[Excerpt]
Viatron
From Wikipedia, the free encyclopedia


Viatron Computer Systems, or simply Viatron was an American computer company headquartered in Bedford, Massachusetts, and later Burlington, Massachusetts. Viatron may have coined the term "microprocessor".[1]

Viatron was founded in 1967 by engineers from Mitre Corporation led by Dr. Edward M. Bennett and Joseph Spiegel. In 1968 the company announced its System 21 small computer system together with its intention to lease the systems starting at a revolutionary price of $40 per month. The basic system included a microprocessor with 512 characters of read/write RAM memory, a keyboard, a 9-inch (23 cm) CRT display and two cartridge tape drives.[2]

The system specifications, advanced for 1968 – five years before the advent of the first commercial personal computers – caused a lot of excitement in the computer industry. The System 21 was aimed, among others, at applications such as mathematical and statistical analysis, business data processing, data entry and media conversion, and educational/classroom use.

The expectation was that the use of new large scale integrated circuit technology (LSI) and volume would enable Viatron to be successful at lower margins; however, the prototype did not incorporate LSI technology. In 1969 Bennett claimed that by 1972 Viatron would have delivered more "digital machines" than had "previously been installed by all computer makers." He declared, "We want to turn out computers like GM turns out Chevvies."[3]

The semiconductor industry was unable to produce circuits in the volumes required, forcing Viatron to sell fewer than the planned 5,000–6,000 systems per month. This raised the production costs per unit and prevented the company from ever achieving profitability.

Bennett was fired in 1970, and the company declared Chapter XI bankruptcy in 1971.[1]
[/Excerpt]
 
Well the tax supported school you attended....didn't do so well.

That could be why we so desperately need a 'no child left behind' program to help rate the school's quality of education, and aid those who need that extra help to graduate

NO CHILD LEFT BEHIND was actually NO CHILD ALLOWED AHEAD.

It reduces the pace of the classroom to the speed of the 504s and Spec Ed kids placed in "the least restrictive environment".
 
Steve Jobs didn't develop the iPad, his employees did.

The LED and computer microchip were developed by NASA.

Wrong, they were developed by Bell Labs and Texas Instruments.

WTF, do either of you know how Google works??


I said the integrated circuit, moron, not the microprocessor:

Integrated circuit - Wikipedia, the free encyclopedia

A precursor idea to the IC was to create small ceramic squares (wafers), each one containing a single miniaturized component. Components could then be integrated and wired into a bidimensional or tridimensional compact grid. This idea, which looked very promising in 1957, was proposed to the US Army by Jack Kilby, and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[9] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.



Jack Kilby's original integrated circuit

Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[10] In his patent application of 6 February 1959,[11] Kilby described his new device as “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated.”[12] The first customer for the new invention was the US Air Force.[13]


Kilby won the 2000 Nobel Prize in Physics for his part of the invention of the integrated circuit.[14] Kilby's work was named an IEEE Milestone in 2009.[15]
 
1% used the word "microchip" ...................
Please quote where you used the phrase "integrated circuit" ................
 
1% used the word "microchip" ...................
Please quote where you used the phrase "integrated circuit" ................

"Microchip" is the same thing as an integrated circuit. However, a "microprocessor" is a computer on an IC; it's not synonymous with the term IC. Note that the microprocessor wasn't invented until 1971. That's two years after we landed on the moon.
 
His employees weren't smart enough to dream up and develop the iPad; they only followed his idea. Remember, the development of the Apple computer began with two guys in a garage pursuing an idea. That business empire, which would compete with Microsoft, would later create thousands of jobs, all thanks to someone's creative innovation.

Apple employees aren't smart enough.......that's funny!!!!! Seems neither is Jobs.

'A full decade before Jobs launched the iPad in 2010, Bill Gates launched Microsoft's touch input tablet computer.'

Read more: Microsoft invented the tablet before Apple - Business Insider
 
Incidentally, onepercenter, if spending creates jobs, whatever happened to the United States' dominant steel industry? This country was the world's biggest steel producer in the 1940s. Where did all those jobs go? Did the demand for steel for our bridges and skyscrapers suddenly stop?

Sorry, your belief that jobs are solely dependent upon spending is incorrect.

I never stated 'solely.' Japanese product dumping (under Nixon). Whenever the working class gets fucked, you can find a Republican responsible.
 
You posted:

Corporate America has gotten away with mass murder.

Employees make all the money for an employer. How much money should they make?

And:

The employer provides the platform. The employee does all of the work. The employer collects the monies.

"Employees make all the money for an employer."

This implies that the employer makes none of the money for themselves.

You then go on to refute that idea with: "The employer provides the platform."

In other words, the employer provides everything necessary to make the product except the labor itself: tools, tech, building, raw materials.

"The employer collects the monies."

Implying that the employee collects none.

Which is a lie.

The employee collects compensation for his labor, compensation that he agreed to beforehand. He is paid for what he puts into the product according to the terms of his agreement with his employer.

What you are trying to say, though, is that in addition to the pay he already receives for his time and effort, he should be paid a share of the profits from the sale of something he never owned and could not have made without the employer's investment.

Is that it?

Employees DO make all of the money for the company.

If all you do is employ, why would you make any of the money?

A platform is a place (actual or virtual) plus rules and direction. Everything else is day-to-day operation, which is (or should be) employee controlled.

An employer DOES collect the money. It goes into his/her account.

Does an employee receive compensation? Yes

Does an employee receive a percentage? Commission employees do. Employee-owned corporations do (kind of).

Anything else?
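The distinction drawn above between straight compensation and a percentage comes down to simple arithmetic. Here's a sketch of the two pay structures named; every figure and rate below is purely hypothetical.

```python
# Hypothetical sketch of the two pay structures discussed above:
# a straight wage versus a base wage plus a commission percentage.
# All numbers are illustrative, not real compensation data.

def wage_pay(hours, hourly_rate):
    """Agreed-upon compensation, independent of revenue collected."""
    return hours * hourly_rate

def commission_pay(base, sales, rate):
    """Base pay plus a fixed percentage of the sales the employee generates."""
    return base + sales * rate

# A wage employee collects what was agreed beforehand:
print(wage_pay(40, 20.0))  # 40 hours at $20/hr -> 800.0

# A commission employee also collects a slice of the monies:
print(commission_pay(400.0, 10_000.0, 0.05))  # 400 + 5% of 10,000 -> 900.0
```

The wage employee's pay is fixed by the agreement regardless of what the employer collects; the commission employee's pay rises with the revenue, which is the "percentage" case conceded above.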
 
Steve Jobs didn't develop the iPad, his employees did.

The LED and computer microchip were developed by NASA.

Wrong, they were developed by Bell Labs and Texas Instruments.

WTF, do either of you know how Google works??


• The Computer Microchip: Modern microchips descend from integrated circuits used in the Apollo Guidance Computer.

• Light-Emitting Diodes (LED): Developed by NASA, the red light-emitting diodes were used to grow plants in space. The technology was later adapted for medical devices for muscle pain relief and relaxation, joint pain, arthritis and muscle spasms. Later generations of the technology are used to combat the symptoms of bone atrophy, multiple sclerosis, diabetic complications and Parkinson's disease.

LAOROSA DESIGN-JUNKY 26 NASA Inventions That We Take For Granted Everyday...
 
15th post
His employees weren't smart enough to dream up and develop the iPad; they only followed his idea. Remember, the development of the Apple computer began with two guys in a garage pursuing an idea. That business empire, which would compete with Microsoft, would later create thousands of jobs, all thanks to someone's creative innovation.

Apple employees aren't smart enough.......that's funny!!!!! Seems neither is Jobs.

'A full decade before Jobs launched the iPad in 2010, Bill Gates launched Microsoft's touch input tablet computer.'

Read more: Microsoft invented the tablet before Apple - Business Insider

The difference is Jobs made one that sells. That still doesn't prove that Apple employees were smart enough, persevering enough and bold enough to launch a product like the iPad on their own.
 
Funny how Hillary herself backtracked on the whole "businesses don't create jobs" garbage, but people here won't.
 
Steve Jobs didn't develop the iPad, his employees did.

The LED and computer microchip were developed by NASA.

Wrong, they were developed by Bell Labs and Texas Instruments.

WTF, do either of you know how Google works??

[Excerpt]
Electroluminescence as a phenomenon was discovered in 1907 by the British experimenter H. J. Round of Marconi Labs, using a crystal of silicon carbide and a cat's-whisker detector.[11][12] Russian Oleg Losev reported creation of the first LED in 1927.[13] His research was distributed in Russian, German and British scientific journals, but no practical use was made of the discovery for several decades.[14][15] Rubin Braunstein[16] of the Radio Corporation of America reported on infrared emission from gallium arsenide (GaAs) and other semiconductor alloys in 1955.[17] Braunstein observed infrared emission generated by simple diode structures using gallium antimonide (GaSb), GaAs, indium phosphide (InP), and silicon-germanium (SiGe) alloys at room temperature and at 77 kelvins.

In 1957, Braunstein further demonstrated that the rudimentary devices could be used for non-radio communication across a short distance. As noted by Kroemer[18] Braunstein".. had set up a simple optical communications link: Music emerging from a record player was used via suitable electronics to modulate the forward current of a GaAs diode. The emitted light was detected by a PbS diode some distance away. This signal was fed into an audio amplifier, and played back by a loudspeaker. Intercepting the beam stopped the music. We had a great deal of fun playing with this setup." This setup presaged the use of LEDs for optical communication applications.


Diagram of the tunnel diode constructed on a zinc diffused area of gallium arsenide semi-insulating substrate
In the fall of 1961, while working at Texas Instruments Inc. in Dallas, TX, James R. Biard and Gary Pittman found that gallium arsenide (GaAs) emitted infrared light when electric current was applied. On August 8, 1962, Biard and Pittman filed a patent titled "Semiconductor Radiant Diode" based on their findings, which described a zinc diffused p–n junction LED with a spaced cathode contact to allow for efficient emission of infrared light under forward bias.
[/Excerpt]


RCA would be before both of your time's ...............................
TI ring any bells??

[Excerpt]
History
The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control over myriad objects from appliances to automobiles to cellular phones and industrial process control.

The first use of the term "microprocessor" is attributed to Viatron Computer Systems describing the custom integrated circuit used in their System 21 small computer system announced in 1968.

Intel introduced its first 4-bit microprocessor 4004 in 1971 and its 8-bit microprocessor 8008 in 1972. During the 1960s, computer processors were constructed out of small and medium-scale ICs—each containing from tens of transistors to a few hundred. These were placed and soldered onto printed circuit boards, and often multiple boards were interconnected in a chassis. The large number of discrete logic gates used more electrical power—and therefore produced more heat—than a more integrated design with fewer ICs. The distance that signals had to travel between ICs on the boards limited a computer's operating speed.

In the NASA Apollo space missions to the moon in the 1960s and 1970s, all onboard computations for primary guidance, navigation and control were provided by a small custom processor called "The Apollo Guidance Computer". It used wire wrap circuit boards whose only logic elements were three-input NOR gates.[8]
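A short illustrative sketch (not from the excerpt above): the Apollo Guidance Computer could be built entirely from three-input NOR gates because NOR is functionally complete — NOT, OR, and AND can all be derived from it. The helper names below (`nor3`, `not_`, `or_`, `and_`) are my own, used only to demonstrate the idea:

```python
def nor3(a, b, c=0):
    """Three-input NOR: output is 1 only when every input is 0."""
    return int(not (a or b or c))

def not_(a):
    return nor3(a, a, a)           # NOR of an input with itself inverts it

def or_(a, b):
    return not_(nor3(a, b))        # inverting a NOR recovers OR

def and_(a, b):
    return nor3(not_(a), not_(b))  # De Morgan: AND is the NOR of the inversions

print(and_(1, 1), or_(0, 1), not_(1))  # 1 1 0
```

Because one gate type suffices, a processor built this way trades gate count for manufacturing simplicity — every logic element in the machine is the same part.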

The first microprocessors emerged in the early 1970s and were used for electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit words. Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, various kinds of automation etc., followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on.

Since the early 1970s, the increase in capacity of microprocessors has followed Moore's law; this originally suggested that the number of components that can be fitted onto a chip doubles every year. With present technology, it is actually every two years,[9] and as such Moore later changed the period to two years.[10]
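As a rough sketch of what the two-year doubling in the excerpt implies (the function and starting figures below are my own illustration, not from the excerpt — the 4004's widely cited transistor count is about 2,300):

```python
def moore_projection(initial_count, start_year, target_year, period=2):
    """Project a transistor count assuming one doubling every `period` years."""
    doublings = (target_year - start_year) // period
    return initial_count * 2 ** doublings

# Intel 4004 (1971), projected a decade out under strict two-year doubling:
print(moore_projection(2300, 1971, 1981))  # 2300 * 2**5 = 73600
```

Five doublings in ten years gives a 32x increase — which is why the trend, even as an approximation, dominated chip design economics.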
[/Excerpt]

Never heard of Viatron either, right??

[Excerpt]
Viatron
From Wikipedia, the free encyclopedia


Viatron Computer Systems, or simply Viatron was an American computer company headquartered in Bedford, Massachusetts, and later Burlington, Massachusetts. Viatron may have coined the term "microprocessor".[1]

Viatron was founded in 1967 by engineers from Mitre Corporation led by Dr. Edward M. Bennett and Joseph Spiegel. In 1968 the company announced its System 21 small computer system together with its intention to lease the systems starting at a revolutionary price of $40 per month. The basic system included a microprocessor with 512 characters of read/write RAM memory, a keyboard, a 9-inch (23 cm) CRT display and two cartridge tape drives.[2]

The system specifications, advanced for 1968 – five years before the advent of the first commercial personal computers – caused a lot of excitement in the computer industry. The System 21 was aimed, among others, at applications such as mathematical and statistical analysis, business data processing, data entry and media conversion, and educational/classroom use.

The expectation was that the use of new large-scale integration (LSI) circuit technology and high volumes would enable Viatron to be successful at lower margins; however, the prototype did not incorporate LSI technology. In 1969 Bennett claimed that by 1972 Viatron would have delivered more "digital machines" than had "previously been installed by all computer makers." He declared, "We want to turn out computers like GM turns out Chevvies."[3]

The semiconductor industry was unable to produce circuits in the volumes required, forcing Viatron to sell fewer than the planned 5,000–6,000 systems per month. This raised the production costs per unit and prevented the company from ever achieving profitability.

Bennett was fired in 1970, and the company declared Chapter XI bankruptcy in 1971.[1]
[/Excerpt]

• The Computer Microchip: Modern microchips descend from integrated circuits used in the Apollo Guidance Computer.

• Light-Emitting Diodes (LED): Developed by NASA, the red light-emitting diodes were used to grow plants in space. This technology was later adapted for medical devices treating muscle pain and spasms, joint pain, and arthritis. Later generations of the technology are used to combat the symptoms of bone atrophy, multiple sclerosis, diabetic complications, and Parkinson's disease.

LAOROSA DESIGN-JUNKY 26 NASA Inventions That We Take For Granted Everyday...
ROFL... NASA had nothing to do with the history of microchips other than being a customer.

LEDs? The first one was created by a Russian (Oleg Losev) in 1927.
 