Analog computing for AI?

shoshi

Platinum Member
Oct 28, 2020
Analog computers were the most powerful in the 1940s and '50s. Now everything is digital, but could the analog computer make a comeback for the development of AI? Let us see what the future holds.
 
Interesting video. I used to work in the field of digital neural networks. One problem mentioned in the video is that there is no way to read out the values of the resistors in an analog system. I don't think that is a problem. There have been a number of systems since the 1990s that used distributed digital processing and distributed memories to simulate neural networks for pattern recognition. Even though the values of the synapses can be read out, there are often too many to be decipherable or useful anyway.

Accuracy was not a problem because often only 8 bits were used for synapse weights. Since many neural cells contribute to each decision, the quantization error from summing many synapses washed out statistically.
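Here is a minimal sketch of that point (my own illustration with made-up numbers, not the system described above): the rounding errors of individual 8-bit weights are roughly independent, so when thousands of weighted inputs are summed, the error grows far more slowly than the signal.

```python
# Quantize random synapse weights to 8 bits and compare the summed
# activation against full precision. Per-weight rounding errors are
# independent, so their sum grows like sqrt(N) while the signal grows
# like N -- the relative error shrinks as more synapses are summed.
import numpy as np

rng = np.random.default_rng(0)
n_synapses = 4096

weights = rng.uniform(-1.0, 1.0, n_synapses)   # "true" high-precision weights
inputs = rng.uniform(0.0, 1.0, n_synapses)     # presynaptic activity

# Quantize weights to 8 bits (256 levels) over the range [-1, 1]
step = 2.0 / 255
quantized = np.round(weights / step) * step

exact = np.dot(weights, inputs)
approx = np.dot(quantized, inputs)

print(f"exact sum      : {exact:.4f}")
print(f"8-bit sum      : {approx:.4f}")
print(f"relative error : {abs(approx - exact) / abs(exact):.2e}")
```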

So I think analog systems would work well, as the video says. For pattern recognition there is a massive amount of image data going into the network, but the output can be very simple, like an x, y location plus an angle.
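As a hedged illustration of that "big image in, tiny answer out" idea (this is not the poster's network; classical image moments stand in for it, and the image is synthetic), a whole frame collapses to just an x, y position and an angle:

```python
# Reduce a full image to (x, y, angle) using image moments on a
# synthetic bright blob -- the same compact output a pattern
# recognition network would produce.
import numpy as np

def locate(image: np.ndarray) -> tuple[float, float, float]:
    """Return centroid (x, y) and orientation angle in degrees."""
    ys, xs = np.nonzero(image > image.mean())
    w = image[ys, xs].astype(float)
    x_c = np.average(xs, weights=w)
    y_c = np.average(ys, weights=w)
    # Second central moments give the orientation of the blob.
    mu20 = np.average((xs - x_c) ** 2, weights=w)
    mu02 = np.average((ys - y_c) ** 2, weights=w)
    mu11 = np.average((xs - x_c) * (ys - y_c), weights=w)
    angle = 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))
    return x_c, y_c, angle

# Synthetic test: a tilted rectangle of bright pixels in a 512x512 frame.
img = np.zeros((512, 512))
yy, xx = np.mgrid[0:512, 0:512]
rot = (xx - 300) * np.cos(0.3) + (yy - 200) * np.sin(0.3)
perp = -(xx - 300) * np.sin(0.3) + (yy - 200) * np.cos(0.3)
img[(abs(rot) < 80) & (abs(perp) < 20)] = 1.0

print(locate(img))   # roughly (300, 200, ~17 degrees)
```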
 
It has always been an analog world. Those little chips and flip-flops just reduce operations to a smaller, less visible scale and change the shape of the waveforms, is all. At higher speeds timing becomes a bigger and bigger problem, so the rise and fall times become more important. I don't know how they can overcome those limitations without moving to some other transmission method besides waveforms, if that is even possible; otherwise they will end up using different points on the rising and falling edges to trigger other devices, and that creates a whole new set of timing issues with inputs and outputs.
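A quick numeric sketch of that trigger-point issue (the rise time and thresholds here are made up for illustration): with a finite rise time, the point on the edge where a device triggers shifts when it fires, and the skew scales directly with the rise time.

```python
# With a finite rise time, the moment a gate "sees" a 1 depends on where
# on the edge it triggers: a 30% threshold fires earlier than a 70% one.
import numpy as np

t = np.linspace(0.0, 2.0, 20001)              # time in nanoseconds
rise_time = 0.5                               # 0 -> 1 transition over 0.5 ns
edge = np.clip(t / rise_time, 0.0, 1.0)       # idealized linear rising edge

for threshold in (0.3, 0.5, 0.7):
    t_trig = t[np.argmax(edge >= threshold)]
    print(f"threshold {threshold:.0%}: triggers at {t_trig:.3f} ns")

# The skew between the 30% and 70% points is 0.4 * rise_time = 0.2 ns,
# which is why faster edges matter more and more at higher clock rates.
```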

Analog triggering has the possibility of making certain types of circuits capable of predicting values ahead of time and getting closer and closer to simulating parallel processing. I'm not an engineer, so I'm just guessing, but my work with lasers did a lot of this type of thing, using arrays that signaled different devices almost simultaneously. The mapping and timing got to be pretty complex studies in themselves.
 
I used to work in the field of digital neural networks
This is the earliest stage of AI, and it was dismissed as unsuccessful. The pinnacle of AI was Minsky's Frame model and Hewitt's Actor model; those were the most recent concepts, and after them the project was closed. Neural networks are just cheap marketing.
 
This is the earliest stage of AI, and it was dismissed as unsuccessful. The pinnacle of AI was Minsky's Frame model and Hewitt's Actor model; those were the most recent concepts, and after them the project was closed. Neural networks are just cheap marketing.
Our company sold thousands of systems that used Artificial Neural Networks in the semiconductor industry for pick-and-place systems. Another application was gold ball bonding: the chips had to be precisely located so the wires could be attached. The system could be trained in seconds by an unskilled operator for different batches. Yes, AI certainly was used in marketing, but it wasn't "cheap" marketing.
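For a feel of what "trained in seconds by an unskilled operator" can look like, here is a hedged sketch, deliberately simplified to plain template matching rather than the patented neural-network approach described above: the operator marks one good part, and the system finds it again by correlation.

```python
# "Training" stores one normalized reference patch; locating slides it
# over the image by brute force and picks the best correlation score.
import numpy as np

def train(image: np.ndarray, top_left: tuple[int, int], size: tuple[int, int]) -> np.ndarray:
    """'Training' is just cutting out and normalizing one reference patch."""
    r, c = top_left
    h, w = size
    patch = image[r:r + h, c:c + w].astype(float)
    return (patch - patch.mean()) / (patch.std() + 1e-9)

def locate(image: np.ndarray, template: np.ndarray) -> tuple[int, int]:
    """Slide the template over the image and return the best-matching corner."""
    h, w = template.shape
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - h + 1):
        for c in range(image.shape[1] - w + 1):
            window = image[r:r + h, c:c + w].astype(float)
            window = (window - window.mean()) / (window.std() + 1e-9)
            score = float(np.sum(window * template))
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Tiny demo: "teach" an 8x8 bright mark at (20, 30), then find it again.
rng = np.random.default_rng(1)
img = rng.normal(size=(64, 64))
img[20:28, 30:38] += 3.0                      # the bright fiducial mark
tmpl = train(img, (20, 30), (8, 8))
print(locate(img, tmpl))                      # -> (20, 30)
```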
 
Our company sold thousands of systems that used Artificial Neural Networks in the semiconductor industry for pick-and-place systems. Another application was gold ball bonding: the chips had to be precisely located so the wires could be attached. The system could be trained in seconds by an unskilled operator for different batches. Yes, AI certainly was used in marketing, but it wasn't "cheap" marketing.
You don't do AI, you just have "programmers" and use the API.
 
For all the squalor that IT development has slipped into since the hippie days, it has hit bottom in the past 10 years. Ten years ago you could run AutoCAD on a processor that can't even run Paint today. Ten years ago Opera ran on its own engine and was lightning fast, and Windows XP was less than 1 GB.
 
There has never been such shameful development as there is now, and there have never been so many morons. Now morons not only write programs, they develop operating system kernels and programming languages.
 
There has never been such shameful development as there is now, and there have never been so many morons. Now morons not only write programs, they develop operating system kernels and programming languages.
Which languages do you mean?
 
You don't do AI, you just have "programmers" and use the API.
What's wrong with you? You sure have a chip on your shoulder. You have no idea what I did.
I was the manager of R&D, developed these programs in-house, and hold patents. We spent at least 90% of the time making the training part fast and user-friendly.
 
C# was considered a Java clone when Microsoft first introduced it, but it has advanced since then; it is an important language for game development and Windows-based apps. JetBrains created Kotlin, and Google adopted it for Android, because you can develop Android apps faster with less code than with Java. Google created Golang, often called Go, because it has the fast run time of C++ and the simplicity of Python. There are reasons new languages are created.
 
C# was considered a Java clone when Microsoft first introduced it, but it has advanced since then; it is an important language for game development and Windows-based apps. JetBrains created Kotlin, and Google adopted it for Android, because you can develop Android apps faster with less code than with Java. Google created Golang, often called Go, because it has the fast run time of C++ and the simplicity of Python. There are reasons new languages are created.
The reason is narrow specialization. The more ready-made tools there are, the more low-skilled personnel you can pull into development, people who don't know programming but use ready-made "interfaces" instead.
 
fast run time of C++
There is no speed in reality. When they demonstrate speed, they simply use artificial use cases, such as how fast a loop runs. When you launch a real program, a completely different story begins. The same legends circulated about Java: they put some fast-looking tests on display, but Java programs actually ran slowly and consumed all the memory.
Despite static explicit typing, which in theory provides good opportunities for code optimization, they still fucked it up.
 
There is no speed in reality. When they demonstrate speed, they simply use artificial use cases, such as how fast a loop runs. When you launch a real program, a completely different story begins. The same legends circulated about Java: they put some fast-looking tests on display, but Java programs actually ran slowly and consumed all the memory.
Despite static explicit typing, which in theory provides good opportunities for code optimization, they still fucked it up.
I was taught that the time a developer spends writing the code usually matters more than the run time.
 
I cited Opera as an example. It worked great on its own engine on my vintage laptop, with roughly a 1 GHz CPU and 2 GB of RAM.
Then fucking Google imposed its nerdy shit on everyone, and now all browsers are the same and shitty. Google search also works about 1000 times worse than it did 15 years ago.
 
Dell has me using Java with Spring for the back end and TypeScript for the client side. Another company like Google or Apple may use something different.
 
Back in the Windows 98 days, the office suite had a simple, well-thought-out interface and worked so quickly that the developers deliberately slowed down the graphics for solidity, and that was on antediluvian machines where Tetris was the only game that would run. Now they have stuffed a bunch of useless shit in there, and it lags, bugs out, and crashes.
 
Unix was a simple, fast, and sleek system, ideal for the web. Then two fuckers got in there with their GNU, copied the author's code, and fucked up the design, and now Unix does not develop; instead we have the fucking penguin, where idiots sit around spending hours figuring out how to stick a fucking font in there.
 
Never used Unix. I do know the Linux operating system, the successor to Unix.
 
