
Computer speed increases will eventually be impossible


Humodour

Recommended Posts

For those interested in FTL communication, a few links.

 

ESA Experiments with Quantum communication and encryption

The quest for manipulating entangled photons

 

FTL communication makes no sense if the messages can only be dealt with by non-quantum machinery. Like being stranded on a remote island with a laptop and no power outlet ;)

“He who joyfully marches to music in rank and file has already earned my contempt. He has been given a large brain by mistake, since for him the spinal cord would surely suffice.” - Albert Einstein
 


I have a question, since it's been a while since I took physics: what exactly are people referring to when they say "very near light speed in computation"?

 

 

 

 

(As an aside, I'm not sure where the speed of light came into the discussion. The article didn't say that we'd max out our speed because of the limit imposed by the speed of light, but rather that there is a limit of some kind, much as the speed of light is a physical limit.)

 

Because the speed of light is (according to the theory of relativity) the ultimate limit on transferring information. No information can be sent faster than light, or it would break causality. The information, say the result of a calculation made by a processor, cannot reach us faster than a photon would, because that would break the laws of physics. Nor can the signals inside the processor that perform the calculation propagate faster than light.

 

Now that I've read the article again, it does seem to put a limit on how fast a single processor can be, separate from light speed. I doubt that limit is even near the physical limit on transferring information.

 

 

The issue with computer speeds isn't the speed of light, but rather the switching latency at the gates. Electric signals already move hella fast. Making transistors smaller and narrowing the gaps means it takes less time to switch the gates, but the signals themselves have been traveling ridiculously fast for a long time now. Even crappy coaxial cable has a propagation speed of about 2/3 the speed of light, and that's pretty old technology by today's standards.
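
To put a rough number on that, here's a back-of-the-envelope sketch in Python. It uses the ~2/3 c figure from the post above; the 2 cm die width is my own assumed figure, not something from the thread or the article.

# Back-of-the-envelope only: raw signal propagation time across a die,
# using the ~2/3 c figure from the post above. The 2 cm die width is an assumption.
c = 299_792_458              # speed of light in a vacuum, m/s
signal_speed = (2 / 3) * c   # rough propagation speed in a conductor
die_width = 0.02             # assumed die width in metres (2 cm)

crossing_time = die_width / signal_speed                    # roughly 1e-10 s
print(f"{crossing_time * 1e12:.0f} ps to cross the die")    # ~100 ps, i.e. ~0.1 ns

In other words, even at two-thirds of light speed a signal crosses a whole die in about a tenth of a nanosecond, which fits the point above that gate delays, not wire speed, are the bottleneck.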

 

Moore's Law will eventually break down due to miniaturization issues, not because of limits imposed by the speed of light. We'll hit a point where transistors become irreducible: you won't be able to remove any more atoms from one and still have it work as a transistor.

 

But the speed of the signals actually traveling along the circuitry probably hasn't changed very much, and improving it would give pretty diminishing returns compared to miniaturizing the processors.
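
A rough sketch of how far that miniaturization can go before hitting atoms. The 32 nm starting node and the ~0.2 nm silicon atom spacing are my own assumed figures, purely for illustration.

# Rough sketch only: how many more halvings of feature size until atomic scale?
# The 32 nm starting node and ~0.2 nm silicon atom spacing are assumptions of mine.
import math

feature_size_nm = 32.0       # assumed current process node
atom_spacing_nm = 0.2        # roughly the distance between neighbouring silicon atoms

halvings = math.log2(feature_size_nm / atom_spacing_nm)
years_left = halvings * 2    # Moore's law pace: very roughly one halving every ~2 years

print(f"about {halvings:.1f} halvings left, on the order of {years_left:.0f} years")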


Faster than light travel, as far as we know, would break causality.

 

I think I speak for all of us when I say "**** causality"

 

I follow Einstein's logic on why an object can't be accelerated beyond a certain speed. But that does not mean EFFECTs cannot be delivered at beyond that speed, as gravity demonstrates. e.g. if someone moved the sun the effects would be felt on Earth immediately, even if the change would not be seen for a while.

"It wasn't lies. It was just... bull****"."

             -Elwood Blues

 

tarna's dead; processing... complete. Disappointed by Universe. RIP Hades/Sand/etc. Here's hoping your next alt has a harp.


I thought quantum computing was instantaneous?

 

Even the speed of light in a vacuum isn't instantaneous, Wals.

 

Did you know that if there were a vacuum metastability event which started annihilating this universe and rewriting the laws of physics, it would expand at the speed of light? That is: depending on where in the universe it began, humans might never know it was happening (even if we lived for millions, perhaps billions of years).

 

But for the record, quantum computing isn't a panacea. For some classes of algorithms it's orders of magnitude faster; for others it's no faster at all. In general, though, it's faster than binary computing. Here's a good article on some recent developments in quantum computing algorithm speeds: http://www.physorg.com/news174286879.html
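
One concrete example of such a class is unstructured search, where Grover's algorithm gives a quadratic (not exponential) speedup. This is my own illustration, not something from the linked article, and the problem size below is arbitrary.

# Illustration of one well-known quantum speedup: Grover's algorithm searches an
# unstructured list in roughly sqrt(N) queries versus roughly N classically.
import math

N = 10**12                        # arbitrary size of the search space
classical_queries = N             # worst case: check every entry
grover_queries = math.isqrt(N)    # on the order of sqrt(N) oracle calls

print(f"classical ~{classical_queries:.1e} queries, Grover ~{grover_queries:.1e} queries")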

 

Also, what about parallel processing?

 

That doesn't make things faster per se, just more efficient (assuming the algorithm is parallelisable).

 

Some algorithms are embarrassingly parallelisable. Some are impossible to parallelise. Parallelisation is more of an optimisation problem to be solved at a software level (although obviously it requires modifications at the hardware level, such as extra cores and/or PUs).
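
Amdahl's law is one way to put a number on "more efficient rather than faster": the serial fraction of a program caps the overall speedup no matter how many cores you add. The 5% serial fraction below is just an assumed example figure.

# Amdahl's law: if a fraction s of a program is inherently serial, no number of
# cores can speed the whole program up by more than 1/s. The 5% is an example value.
def amdahl_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (2, 4, 16, 256, 1_000_000):
    print(cores, round(amdahl_speedup(0.05, cores), 2))
# Even with a million cores the speedup stalls just under 1 / 0.05 = 20x.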


They test faster than light communications. They have established that it works. They haven't found out why it works yet, but once that hurdle is overcome, there might be some new quantum leaps (pun unintended) on their way in science.
From what I've read, photon entanglement-based communication still requires that you send packets over a distance. It isn't FTL, as mentioned on p. 37 of the ESA doc you linked to.

 

http://en.wikipedia.org/wiki/Quantum_teleportation

 

 

I follow Einstein's logic on why an object can't be accelerated beyond a certain speed. But that does not mean EFFECTs cannot be delivered at beyond that speed, as gravity demonstrates. e.g. if someone moved the sun the effects would be felt on Earth immediately, even if the change would not be seen for a while.
"Effects" aren't delivered magically across space. In the case of gravity, it's a field that transfers energy (they are trying to find the particle responsible for that). Gravitational waves (effectively the equivalent of sudden changes like "moving the sun") don't travel faster than light -- they preserve causality. Edited by 213374U

- When he is best, he is a little worse than a man, and when he is worst, he is little better than a beast.


Faster than light travel, as far as we know, would break causality.

 

I think I speak for all of us when I say "**** causality"

 

I follow Einstein's logic on why an object can't be accelerated beyond a certain speed. But that does not mean EFFECTs cannot be delivered at beyond that speed, as gravity demonstrates. e.g. if someone moved the sun the effects would be felt on Earth immediately, even if the change would not be seen for a while.

 

Any computer whose computations are based on the speed of propagation of radiation (including gravitic radiation but not gravitic fields) has a maximum speed as defined by the Planck time. I thought the physorg article mentioned this, but it did not.
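
For scale, a quick sketch. The Planck time value is the standard constant; treating "one operation per Planck time" as the ceiling is my own illustration, not a figure from the article.

# Illustration only: if a device could perform at most one operation per Planck time,
# the ceiling would be 1/t_P operations per second. Equating a "Planck-time operation"
# with anything like a flop is my simplification, not a claim from the article.
planck_time = 5.39e-44                 # seconds
ops_per_second = 1 / planck_time
print(f"~{ops_per_second:.1e} operations per second")   # roughly 1.9e43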

 

Even if you could control the manipulation of the distortion of space-time to generate FTL communication (big if), I highly doubt it'd be possible to miniaturise that to Planck lengths (and hence Planck times) without tearing the device apart, making it rather meaningless as a concept. Unless you had some notion of building a gravitic computer the size of a planet or solar system or something.

 

Edit: I distinguish between the speed of gravitic radiation and propagation of gravitic fields (distortions in space-time) because the latter is so hard to compute, given gravity is so weak. Evidence seems to suggest that gravitic fields travel at the speed of light, however.

 

http://math.ucr.edu/home/baez/physics/Rela...grav_speed.html

 

Edit 2: What I'm trying to say is that, along with numbers's observation that quantum teleportation isn't FTL, this really is the limit to computer speeds. Anything that intends to violate this fundamental limit of physics would have to invent FTL communication in the process, and I think that would be news in and of itself!

Edited by Krezack

Even if you could control the manipulation of the distortion of space-time to generate FTL communication (big if)
Yeah, big IF. Especially considering that the distortion of space-time (gravitational wave) does NOT travel faster than light.

- When he is best, he is a little worse than a man, and when he is worst, he is little better than a beast.


I bet people a hundred years from now will laugh at the theories people are discussing today. Not a dig at anyone in particular, just saying: compare what we know now to a hundred years ago... it would look like magic if people even 30 years ago could see what is possible today.

 

Between people like Einstein and Planck, most of relativity and QM had been laid out and theorised by the early 20th century - id est: roughly 100 years ago.

 

What we have today in terms of technology is all advances in engineering, in a manner of speaking. We've not violated any fundamental principles that were laid out in relativity and QM 100 years ago, as magical as our technology might appear.

Edited by Krezack

I'm just saying, even thirty years ago it was inconceivable for most people to believe that they would be using a device smaller than a pack of cards that can take thousands of photos, play hundreds of music tracks and videos, record home movies, and let them talk to anyone around the world. It was the stuff of Sci-Fi at the time. I agree with what Gorth said: "Which is why completely new thinking is required... As long as there are people who think we are limited by the speed of light, we will be limited by it."


Anyway, as for what speed this limit actually represents, here goes: 10^16 times faster than the fastest computer today, which is roughly 10^15 flops. So that gives about 10^31 flops. It's actually smaller than I thought. It's more than enough to perfectly simulate a human brain, though, including gene regulation and molecule movement. Which is good, because I doubt we'll ever actually get as fast as this limit.
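
Spelling out that arithmetic, taking the post's own figures at face value:

# Just restating the arithmetic above, using the post's own figures.
fastest_today = 1e15      # flops, the figure quoted for today's fastest machine
headroom = 1e16           # the claimed factor between that and the physical limit
limit = fastest_today * headroom
print(f"{limit:.0e} flops")    # 1e+31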


They test faster than light communications. They have established that it works. They haven't found out why it works yet, but once that hurdle is overcome, there might be some new quantum leaps (pun unintended) on their way in science.
From what I've read, photon entanglement-based communication still requires that you send packets over a distance. It isn't FTL, as mentioned on p. 37 of the ESA doc you linked to.

http://en.wikipedia.org/wiki/Quantum_teleportation

Yes, sadly, establishing the connection still can't be done faster than light, but hey, you need to set yourself some interesting challenges to overcome.

 

I don't think Benjamin Franklin envisioned all the practical uses of lightning, as much as he was simply interested in understanding the mechanisms behind it :)

“He who joyfully marches to music in rank and file has already earned my contempt. He has been given a large brain by mistake, since for him the spinal cord would surely suffice.” - Albert Einstein
 


I'm just saying, even thirty years ago it was inconceivable for most people to believe that they would be using a device smaller than a pack of cards that can take thousands of photos, play hundreds of music tracks and videos, record home movies, and let them talk to anyone around the world. It was the stuff of Sci-Fi at the time.

 

And yet none of that required breaking fundamental physical laws.

 

I agree with what Gorth said, "Which is why completely new thinking is required... As long as there are people who think we are limited by the speed of light, we will be limited by it."

 

This isn't a matter of faith, though. This is a physical limit which requires breaking the speed of light to violate it. That's possibly the biggest thing that will ever happen in physics, if it's even possible.

 

So while I'm perfectly willing to accept that scientists need to dream, to imagine the impossible, ignore limits, etc, I think vesting your faith in the continuation of Moore's law on the assumption of breaking the biggest fundamental law in the universe is misguided.

 

Sometimes humans become very claustrophobic when confronted with the notion that the universe isn't infinite (in both time and length). I think this is a case of such claustrophobia. :)


I don't think Benjamin Franklin envisioned all the practical uses of lightning, as much as he was simpy interested in understanding the mechanisms behind it :)
Oh, I agree. Where would we be without the great tinkerers? Case in point.

 

 

So while I'm perfectly willing to accept that scientists need to dream, to imagine the impossible, ignore limits, etc, I think vesting your faith in the continuation of Moore's law on the assumption of breaking the biggest fundamental law in the universe is misguided.
Pretty much.

 

Universal gravitation was formulated by Newton in the 1600s. Not only have we NOT found a way around gravity (despite our discovery of flight)... science has kept reinforcing and extending Newton's postulates. You can only sidestep the fundamental barriers of nature so much.

- When he is best, he is a little worse than a man, and when he is worst, he is little better than a beast.


I'm just saying, even thirty years ago it was inconceivable for most people to believe that they would be using a device smaller than a pack of cards that can take thousands of photos, play hundreds of music tracks and videos, record home movies, and let them talk to anyone around the world. It was the stuff of Sci-Fi at the time. I agree with what Gorth said: "Which is why completely new thinking is required... As long as there are people who think we are limited by the speed of light, we will be limited by it."

 

What Gorth said is kind of silly. We are limited by light speed just as we are limited by time flowing forward instead of backward. For some reason our universe seems to have a score of unbreakable laws that keep it together and running as it does. Sure, you can believe that at some point our brilliant engineering will crush even the laws of physics themselves, but that goes against all the evidence we have and is no different from believing magic is real.


The law is named for Intel co-founder Gordon E. Moore, who introduced it in a 1965 paper. It has since been used in the semiconductor industry to guide long-term planning and to set targets for research and development.

 

The self-fulfilling element here is interesting. Because the law exists, R&D departments revise their goals so as not to appear 'behind'.

Na na  na na  na na  ...

greg358 from Darksouls 3 PVP is a CHEATER.

That is all.

 


I agree with Gorth.

 

Basically, today's computers rely on time to deliver data, with a certain amount of it to be calculated within that time cycle. I wouldn't suggest trying to dupe the laws of relativity, but rather improving how the computer deals with the data. Thus, advancements within A.I. would be the key. "Smart" computers, if you like the sound of it.

 

Then of course there is the immediate problem of creating a programming model that generates a self-evolving entity that adheres to Asimov's rules. This leads on to the singularity problem, with computers (robots) behaving more human than humans (think Blade Runner), and eventually ends with humankind (and robokind) becoming their own gods by transferring their consciousness to a self-repairing interstellar grid of "Internet", where they can rule their own domains by themselves or intertwine with one another. Eventually, man and machine have a shared consciousness and our humanity is forever changed.

 

But then, there's the damn entropy.

"Some men see things as they are and say why?"
"I dream things that never were and say why not?"
- George Bernard Shaw

"Hope in reality is the worst of all evils because it prolongs the torments of man."
- Friedrich Nietzsche

 

"The amount of energy necessary to refute bull**** is an order of magnitude bigger than to produce it."

- Some guy 


Meshugger, what you're talking about is making computers more efficient, not faster. I think someone (number man) already addressed that before.

 

I have no idea how neurons fit into that equation, but I really doubt they're even near the limit the scientists calculated. I don't doubt we'll be able to make a computer as efficient as, or even more efficient than, the human brain, but there is a limit on how fast a single processor performing a single calculation or action can be.

Edited by Lare Kikkeli

I agree with Numbers. We may laugh at some ideas, but Newton is proof that oldies can be goodies.

"It wasn't lies. It was just... bull****"."

             -Elwood Blues

 

tarna's dead; processing... complete. Disappointed by Universe. RIP Hades/Sand/etc. Here's hoping your next alt has a harp.


Even a single processor has a great deal of parallelism in it; that's what pipelining is all about.

 

Yep. Most people don't realise this. It's not just pipelining, either.

 

Bit-level parallelism, instruction-level parallelism (where pipelines come in), superscalar processing - and all of these happen before you even get to multiple cores or CPUs. We've had parallel computing for decades.

 

Data-level parallelism and task parallelism are the new ones.
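
For illustration, a minimal data-parallel sketch: the same operation applied to independent chunks of data on separate cores. The squaring workload and the worker count are toy choices of mine, not anything from the thread.

# Minimal data-parallel sketch: independent pieces of work farmed out to worker
# processes. The workload (squaring numbers) and the pool size are arbitrary examples.
from multiprocessing import Pool

def work(x):
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:              # 4 workers, chosen arbitrarily
        results = pool.map(work, range(1_000_000))
    print(results[:5], "...", results[-1])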
