Don't Let Anybody Tell You That Businesses Create Jobs

That is a lie, Comrade, as you know. The laborer follows the instructions of his supervisor. Often the laborer has no idea at all what it is he is producing, or how the widget made by the machine he starts with the push of a button goes into the final product. And it doesn't matter; they are just unskilled labor, easily replaced and of no real consequence to the process or the product.

Thank you, Mrs. Helmsley.
 
You gave a definition that is irrelevant to a legal deduction in taxes.

Epic fail, as always.

Serious question: have you ever held a job? Have you ever in your life performed a task that another person would willingly pay you for?

Or is looting the extent of your life?

I'll make it easier for you.

Money you didn't get.
 
'Apple lacks the technical expertise of Microsoft', care to elaborate on that? Or can you?

Microsoft created capacitive touch interfaces, TFT inductive screens, reactive LCD technology, etc.

Microsoft has the best technical minds in the world, creating technology unrivaled by anyone. The small minded had a fantasy of Microsoft and Apple in some technological war. That never existed, Apple simply licensed the technology from Microsoft and went ahead with their plans. Apple never attempted to create rival technology, why would they when they could simply license what they need.

Again, you are but a looter, you have no clue how real business operates.
 
ROFL dude, put down the pipe. All that stuff came out when Gates was still in diapers.
 
Employees DO make all of the money for the company.

If you employ, why would you make any of the money?

A platform is a place (actual or virtual) and rules and direction. Everything else is day-to-day operation which is (should be) employee controlled.

An employer DOES collect the money. It goes into his/her account.

Does an employee receive compensation? Yes

Does an employee receive a percentage? Commission employees do. Employee-owned corporations do (kind of).

Anything else?

Yo Peanut ... There are no employees without an employer.
Everyone has the right to start their own business if they think they can do it without an employer.

The Verve - Bitter Sweet Symphony - YouTube

.
 
ROFL dude, put down the pipe. All that stuff came out when Gates was still in diapers.

Not even close.

Everything in computing is built on earlier technology, but MS is behind on true touch technology:

{Multitouch technology struggled in the mainstream, appearing in specialty devices but never quite catching a big break. One almost came in 2002, when Canada-based DSI Datotech developed the HandGear + GRT device (the acronym "GRT" referred to the device's Gesture Recognition Technology). The device's multipoint touchpad worked a bit like the aforementioned iGesture pad in that it could recognize various gestures and allow users to use it as an input device to control their computers. "We wanted to make quite sure that HandGear would be easy to use," VP of Marketing Tim Heaney said in a press release. "So the technology was designed to recognize hand and finger movements which are completely natural, or intuitive, to the user, whether they're left- or right-handed. After a short learning-period, they're literally able to concentrate on the work at hand, rather than on what the fingers are doing."}

From touch displays to the Surface: A brief history of touchscreen technology (Ars Technica)
 
So the end justifies the means?

If you don't like it ... Do it better yourself.
If you think it is unjust ... Start a business and do it the way you want to.

Then you may actually understand what an employer provides for their employees.

.
 
Amusing, 1230+ posts and nobody can refute Hillary Clinton.

Do you honestly think that sort of idiocy helps your case?

Hillary spouted warmed over Marxism that was too stupid to take seriously.

Without business, there are no jobs in a free market - period.

That you would substitute forced labor under the whip of the state does not alter the fact that in a free market, business is the source of jobs.
 
Amusing, 1230+ posts and nobody can refute Hillary Clinton.
Which statement are we supposed to refute?

The one where she says businesses don't create jobs or the next one where she says businesses do create jobs?
 
15th post
• The Computer Microchip: Modern microchips descend from integrated circuits used in the Apollo Guidance Computer.

• Light-Emitting Diodes (LED): Developed by NASA, the red light-emitting diodes were used to grow plants in space. Later this technology was adapted for medical devices for muscle pain relief/relaxation, joint pain, arthritis and muscle spasms. Later generations of the technology are used to combat the symptoms of bone atrophy, multiple sclerosis, diabetic complications and Parkinson's disease.

LAOROSA / DESIGN-JUNKY: 26 NASA Inventions That We Take For Granted Everyday...

Once again you should not use junk web sites .....................

[Excerpt]
Background
Electroluminescence, the natural phenomenon upon which LED technology is built, was discovered in 1907 by British radio researcher and assistant to Guglielmo Marconi, Henry Joseph Round, while experimenting with silicon carbide and a cat's whisker.
During the 1920s, Russian radio researcher Oleg Vladimirovich Losev was studying the phenomenon of electroluminescence in the diodes used in radio sets. In 1927, he published a paper called "Luminous carborundum [silicon carbide] detector and detection with crystals" about his research, and while no practical LED was created at that time based on his work, his research did influence future inventors.

Years later, in 1961, Robert Biard and Gary Pittman invented and patented an infrared LED for Texas Instruments. This was the first LED; however, being infrared, it emitted light beyond the visible spectrum, which humans cannot see. Ironically, Biard and Pittman invented the light-emitting diode only by accident, while the pair were actually attempting to invent a laser diode.

Visible LEDs
In 1962, Nick Holonyak, a consulting engineer for General Electric Company, invented the first visible-light LED. It was a red LED, and Holonyak had used gallium arsenide phosphide as a substrate for the diode.
Holonyak has earned the honor of being called the "Father of the light-emitting diode" for his contribution to the technology. He also holds 41 patents, and his other inventions include the laser diode and the first light dimmer. (Another interesting fact about Holonyak is that he was once the student of John Bardeen, the co-inventor of the transistor.)

In 1972, electrical engineer M. George Craford invented the first yellow LED for the Monsanto Company, using gallium arsenide phosphide in the diode. Craford also invented a red LED that was 10 times brighter than Holonyak's.

It should be noted that the Monsanto Company was the first to mass-produce visible LEDs. In 1968, Monsanto produced red LEDs used as indicators. But it was not until the 1970s that LEDs became popular, when Fairchild Optoelectronics began producing low-cost LED devices (less than five cents each) for manufacturers.

In 1976, Thomas P. Pearsall invented a high-efficiency and extremely bright LED for use in fiber optics and fiber telecommunications. Pearsall invented new semiconductor materials optimized for optical fiber transmission wavelengths.

In 1994, Shuji Nakamura invented the first blue LED using gallium nitride.
[Excerpt]
Who Invented LED or the Light Emitting Diode

• The Computer Microchip: Modern microchips descend from integrated circuits used in the Apollo Guidance Computer.
.

Comprehension is the word of the day; the way that reads is ....................

[Excerpt]

Integrated circuit
From Wikipedia, the free encyclopedia

[Image captions from the original article: EPROM packages with transparent windows for UV erasure; an EPROM die showing memory blocks and bond wires; a cross-section of an IC through four layers of copper interconnect.]
An integrated circuit or monolithic integrated circuit (also referred to as an IC, a chip, or a microchip) is a set of electronic circuits on one small plate ("chip") of semiconductor material, normally silicon. This can be made much smaller than a discrete circuit made from independent components. ICs can be made very compact, having up to several billion transistors and other electronic components in an area the size of a fingernail. The width of each conducting line in a circuit can be made smaller and smaller as the technology advances; in 2008 it dropped below 100 nanometers,[1] and now it is tens of nanometers.[2]

ICs were made possible by experimental discoveries showing that semiconductor devices could perform the functions of vacuum tubes and by mid-20th-century technology advancements in semiconductor device fabrication. The integration of large numbers of tiny transistors into a small chip was an enormous improvement over the manual assembly of circuits using discrete electronic components. The integrated circuit's mass production capability, reliability, and building-block approach to circuit design ensured the rapid adoption of standardized integrated circuits in place of designs using discrete transistors.

There are two main advantages of ICs over discrete circuits: cost and performance. Cost is low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, much less material is used to construct a packaged IC die than to construct a discrete circuit. Performance is high because the components switch quickly and consume little power (compared to their discrete counterparts) as a result of the small size and close proximity of the components. As of 2012, typical chip areas range from a few square millimeters to around 450 mm2, with up to 9 million transistors per mm2.

Integrated circuits are used in virtually all electronic equipment today and have revolutionized the world of electronics. Computers, mobile phones, and other digital home appliances are now inextricable parts of the structure of modern societies, made possible by the low cost of producing integrated circuits.

Invention
Early developments of the integrated circuit go back to 1949, when the German engineer Werner Jacobi (Siemens AG)[6] filed a patent for an integrated-circuit-like semiconductor amplifying device[7] showing five transistors on a common substrate in a 3-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. An immediate commercial use of his patent has not been reported.

The idea of the integrated circuit was conceived by a radar scientist working for the Royal Radar Establishment of the British Ministry of Defence, Geoffrey W.A. Dummer (1909–2002). Dummer presented the idea to the public at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[8] He gave many symposia publicly to propagate his ideas, and unsuccessfully attempted to build such a circuit in 1956.

A precursor idea to the IC was to create small ceramic squares (wafers), each one containing a single miniaturized component. Components could then be integrated and wired into a bidimensional or tridimensional compact grid. This idea, which looked very promising in 1957, was proposed to the US Army by Jack Kilby, and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[9] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.


Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[10] In his patent application of 6 February 1959,[11] Kilby described his new device as “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated.”[12] The first customer for the new invention was the US Air Force.[13]

Kilby won the 2000 Nobel Prize in Physics for his part of the invention of the integrated circuit.[14] Kilby's work was named an IEEE Milestone in 2009.[15]

Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.

Robert Noyce credited Kurt Lehovec of Sprague Electric for the principle of p–n junction isolation caused by the action of a biased p–n junction (the diode) as a key concept behind the IC.[16]

Fairchild Semiconductor was also home of the first silicon-gate IC technology with self-aligned gates, which stands as the basis of all modern CMOS computer chips. The technology was developed by Italian physicist Federico Faggin in 1968, who later joined Intel in order to develop the very first Central Processing Unit (CPU) on one chip (Intel 4004), for which he received the National Medal of Technology and Innovation in 2010.

Generations
In the early days of integrated circuits, only a few transistors could be placed on a chip, as the scale used was large because of the contemporary technology, and manufacturing yields were low by today's standards. As the degree of integration was small, the design process was relatively simple. Over time, millions, and today billions,[17] of transistors could be placed on one chip, and a good design required thorough planning. This gave rise to new design methods.

SSI, MSI and LSI
The first integrated circuits contained only a few transistors. Called "small-scale integration" (SSI), digital circuits containing tens of transistors provided a few logic gates, for example, while early linear ICs such as the Plessey SL201 or the Philips TAA320 had as few as two transistors. The term "large-scale integration" was first used by IBM scientist Rolf Landauer when describing the theoretical concept;[citation needed] from there came the terms SSI, MSI, VLSI, and ULSI.

SSI circuits were crucial to early aerospace projects, and aerospace projects helped inspire development of the technology. Both the Minuteman missile and Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo guidance computer led and motivated the integrated-circuit technology,[18] while the Minuteman missile forced it into mass-production. The Minuteman missile program and various other Navy programs accounted for the total $4 million integrated circuit market in 1962, and by 1968, U.S. Government space and defense spending still accounted for 37% of the $312 million total production. The demand by the U.S. Government supported the nascent integrated circuit market until costs fell enough to allow firms to penetrate the industrial and eventually the consumer markets. The average price per integrated circuit dropped from $50.00 in 1962 to $2.33 in 1968.[19] Integrated circuits began to appear in consumer products by the turn of the decade, a typical application being FM inter-carrier sound processing in television receivers.

The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "medium-scale integration" (MSI).

They were attractive economically because while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work (because of fewer separate components), and a number of other advantages.

Further development, driven by the same economic factors, led to "large-scale integration" (LSI) in the mid-1970s, with tens of thousands of transistors per chip.

Integrated circuits such as 1K-bit RAMs, calculator chips, and the first microprocessors, that began to be manufactured in moderate quantities in the early 1970s, had under 4000 transistors. True LSI circuits, approaching 10,000 transistors, began to be produced around 1974, for computer main memories and second-generation microprocessors.

VLSI
The final step in the development process, starting in the 1980s and continuing through the present, was "very large-scale integration" (VLSI). The development started with hundreds of thousands of transistors in the early 1980s, and continues beyond several billion transistors as of 2009.

Multiple developments were required to achieve this increased density. Manufacturers moved to smaller design rules and cleaner fabrication facilities, so that they could make chips with more transistors and maintain adequate yield. The path of process improvements was summarized by the International Technology Roadmap for Semiconductors (ITRS). Design tools improved enough to make it practical to finish these designs in a reasonable time. The more energy efficient CMOS replaced NMOS and PMOS, avoiding a prohibitive increase in power consumption.

In 1986 the first one megabit RAM chips were introduced, which contained more than one million transistors. Microprocessor chips passed the million transistor mark in 1989 and the billion transistor mark in 2005.[20] The trend continues largely unabated, with chips introduced in 2007 containing tens of billions of memory transistors.[21]

ULSI, WSI, SOC and 3D-IC
To reflect further growth of the complexity, the term ULSI, which stands for "ultra-large-scale integration", was proposed for chips with a complexity of more than 1 million transistors.[22]

Wafer-scale integration (WSI) is a system of building very-large integrated circuits that uses an entire silicon wafer to produce a single "super-chip". Through a combination of large size and reduced packaging, WSI could lead to dramatically reduced costs for some systems, notably massively parallel supercomputers. The name is taken from the term Very-Large-Scale Integration, the current state of the art when WSI was being developed.[23]

A system-on-a-chip (SoC or SOC) is an integrated circuit in which all the components needed for a computer or other system are included on a single chip. The design of such a device can be complex and costly, and building disparate components on a single piece of silicon may compromise the efficiency of some elements. However, these drawbacks are offset by lower manufacturing and assembly costs and by a greatly reduced power budget: because signals among the components are kept on-die, much less power is required (see Packaging).[24]

A three-dimensional integrated circuit (3D-IC) has two or more layers of active electronic components that are integrated both vertically and horizontally into a single circuit. Communication between layers uses on-die signaling, so power consumption is much lower than in equivalent separate circuits. Judicious use of short vertical wires can substantially reduce overall wire length for faster operation.[25][/Excerpt]

Now we have beaten around and around that bush and you have come up empty every time; the source you cite is a hack, and I have volumes more research to disprove it ......................

In the highlighted section, the phrase "led and motivated" does not mean invented!!
 
You do realize with a page rank of 0, I would have to go look if they told me the sun was shining .................

design-laorosa.com snoop summary
Report last updated: 1 minute ago
This is a free and comprehensive report about design-laorosa.com. The domain design-laorosa.com is currently hosted on a server located in Mountain View CA, United States with the IP address 216.239.34.21. The local currency for Mountain View CA, United States is USD ($). The website design-laorosa.com is expected to be earning an estimated $2 USD per day. If design-laorosa.com was to be sold it would possibly be worth $657 USD (based on the daily revenue potential of the website over a 12 month period). According to our google pagerank analysis, the url design-laorosa.com currently has a pagerank of 0/10. Our records indicate that design-laorosa.com receives an estimated 537 unique visitors each day - a decent amount of traffic!

design-laorosa.com

Junk sites make you out to be a laughing stock to your playmates .......................
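For what it's worth, that snoop report's own numbers don't quite line up. It says "worth" is the daily revenue potential over a 12-month period, but the quoted $2/day would give $730, not $657; the $2 is presumably rounded up from about $1.80/day. A quick sanity check in Python (the 365-day multiplier is my assumption about their method, based on the report's own wording):

```python
# Sanity-check the snoop report's valuation method: "worth" is claimed
# to be the daily revenue potential projected over a 12-month period.
quoted_daily = 2        # USD/day, as printed in the report (rounded)
claimed_worth = 657     # USD, the report's 12-month valuation

print(quoted_daily * 365)    # 730 -- more than the claimed $657
print(claimed_worth / 365)   # 1.8 -- the unrounded daily figure implied
```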
 
The Apollo flight computer was the first to use integrated circuits (ICs). While the Block I version used 4,100 ICs, each containing a single 3-input NOR gate, the later Block II version (used in the crewed flights) used 2,800 ICs, each with dual 3-input NOR gates.[1]:34 The ICs, from Fairchild Semiconductor, were implemented using resistor-transistor logic (RTL) in a flat-pack. They were connected via wire wrap, and the wiring was then embedded in cast epoxy plastic. The use of a single type of IC (the dual NOR3) throughout the AGC avoided problems that plagued another early IC computer design, the Minuteman II guidance computer, which used a mix of diode-transistor logic and diode logic gates.
Apollo Guidance Computer - Wikipedia the free encyclopedia

First to use: does your wee little mind see that as "invented"??
Who provided them?? Fairchild Semiconductor??
So a rocket company invented them but had to go to a semiconductor company to buy them??

Do you have any "Dignity" left at all??
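Whoever gets the credit, there is a neat engineering point buried in the AGC quote above: a single 3-input NOR gate type sufficed for an entire computer because NOR is functionally complete, meaning NOT, OR, and AND can all be built from it. A minimal sketch in Python (the function names are mine, purely illustrative):

```python
def nor3(a, b, c):
    """3-input NOR: the only gate type used throughout the Block II AGC."""
    return int(not (a or b or c))

# NOR is functionally complete: every other gate can be built from it.
def not_(a):       # NOT x  == NOR(x, x, x)
    return nor3(a, a, a)

def or_(a, b):     # x OR y == NOT(NOR(x, y, y))
    return not_(nor3(a, b, b))

def and_(a, b):    # x AND y == NOR(NOT x, NOT y, NOT y)
    return nor3(not_(a), not_(b), not_(b))

# Exhaustively verify against Python's own boolean operators.
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == int(not a)
        assert or_(a, b) == int(bool(a or b))
        assert and_(a, b) == int(bool(a and b))
```

Using one gate type everywhere is what let the AGC avoid the mixed-logic headaches the excerpt mentions for the Minuteman II computer.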
 
"Employees make all the money for an employer."

You're a ******* retard.

I'll tell you what: I will give you a pound of silicon, a basket of diodes and resistors, and a soldering iron. In 10,000 years, you could not produce a single iPhone.

Every Communist in history has claimed that muscle builds everything - yet not a one of you can create shit.

If you employ, why would you make any of the money?

We've been through this before, moron:

It is the mind of the person who creates the process and product that makes the money. It was Jack Kilby who, on his own time while employed by Texas Instruments, created the integrated circuit and changed the world - though you lie about this despite being corrected.

The assembler on the plant floor could never invent the IC. You Communists tell us that the laborer creates, but it was Kilby alone who created, then trained labor to assemble his creation.

Labor can be trained to replicate a task that an inventor or engineer developed - labor cannot create.

You lied about the microprocessor as well - created by Intel, a private company. Again, despite your lies, Federico Faggin and his team at Intel created the 4004 in private industry, for a Japanese calculator company.

A platform is a place (actual or virtual) and rules and direction. Everything else is day-to-day operation which is (should be) employee controlled.

An employer DOES collect the money. It goes into his/her account.

Does an employee receive compensation? Yes

Does an employee receive a percentage? Commission employees do. Employee-owned corporations do (kind of).

Anything else?

A laborer is paid for labor - they do not create anything; they merely perform tasks as they are trained and instructed.

A laborer creates the end product from which the company makes all of its money.


No, not really. Each labourer only makes a small part of the product. He may only apply a label or paint the casing. The labourer wouldn't be able to do jack squat without the tools and machinery provided by the factory owners.
 