
Intel pledges 80 cores in five years



Looks like gaming will get very extreme in five years. 80 cores! I'm building my Intel Core 2 Duo system soon and I want 1GB per core. That means in five years, our systems might have room for 80GB of RAM.

 

Quad cores are coming in November. Exciting stuff! My gaming PC will rock any current or upcoming console released this year.

 

Linky pop

SAN FRANCISCO--Intel has built a prototype of a processor with 80 cores that can perform a trillion floating-point operations per second.

 

CEO Paul Otellini held up a silicon wafer with the prototype chips before several thousand attendees at the Intel Developer Forum here Tuesday. The chips are capable of exchanging data at a terabyte a second, Otellini said during a keynote speech. The company hopes to have these chips ready for commercial production within a five-year window.

 

Intel uses its twice-yearly conference to educate developers on its long-term and short-term plans. Over three days, hardware developers and partners get a chance to interact with Intel employees and take classes on new technologies.

Intel's 80-core chips

 

As expected, Intel announced plans to have quad-core processors ready for its customers in November. An extremely fast Core 2 Extreme processor with four cores will be released then, and the newly named Core 2 Quad processor for mainstream desktops will follow in the first quarter of next year, Otellini said.

 

The quad-core server processors are on a similar trajectory, with a faster Xeon 5300 processor scheduled for November and a low-power Xeon slated for the first quarter. Intel's first quad-core processors are actually two of its dual-core Core architecture chips combined into a multichip package.

 

"Performance matters again," Otellini said, disclosing that the quad-core desktop processor will deliver 70 percent faster integer performance than the Core 2 Duo, and the quad-core server processor will be 50 percent faster than the Xeon 5100 introduced in June.

 

One reason performance didn't matter to Intel over the last couple of years was that it was getting trounced on benchmarks at the hands of Advanced Micro Devices' Opteron and Athlon 64 server and desktop processors. That all changed with the introduction of the Core 2 Duo chips this year.

 

"With this new set of dual and quad-core processors, we've regained our leadership," Otellini told developers. The growing Internet video phenomenon, as evidenced by the spectacular rise of Web sites like YouTube, will keep these processors busy during intensive tasks like video editing, he said.

 

Notebooks will get a face-lift next year with the Santa Rosa platform, which will provide notebooks with new technologies like 802.11n wireless and flash memory. Intel believes that it will be the first to add flash memory to a notebook motherboard, which will improve boot times and reduce power consumption, Otellini said.

 

System power consumption is only one part of the equation. Over the next few years, Intel wants to improve the performance per watt of power consumption of its transistors by 300 percent through new manufacturing technologies and designs, Otellini said. The next step on that road, Intel's 45-nanometer manufacturing technology, will enable the company to build chips that deliver a 20 percent improvement in performance with five times less current leakage, he said.

 

But the ultimate goal, as envisioned by Intel's terascale research prototype, is to enable a trillion floating-point operations per second--a teraflop--on a single chip. Ten years ago, the ASCI Red supercomputer at Sandia National Laboratories became the first supercomputer to deliver 1 teraflop using 4,510 computing nodes.

 

Intel's prototype uses 80 floating-point cores, each running at 3.16GHz, said Justin Rattner, Intel's chief technology officer, in a speech following Otellini's address. In order to move data in between individual cores and into memory, the company plans to use an on-chip interconnect fabric and stacked SRAM (static RAM) chips attached directly to the bottom of the chip, he said.
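
As a rough sanity check on those figures, the arithmetic works out if each core retires four floating-point operations per cycle (say, two fused multiply-adds per clock; that per-core rate is an assumption, since the article doesn't state it):

```python
# Back-of-the-envelope check of the teraflop claim.
# Assumption (not stated in the article): each core retires 4 FLOPs per cycle,
# e.g. two fused multiply-add results per clock.
cores = 80
clock_hz = 3.16e9        # 3.16 GHz per core
flops_per_cycle = 4      # assumed

peak = cores * clock_hz * flops_per_cycle
print(f"peak: {peak / 1e12:.2f} TFLOPS")   # -> peak: 1.01 TFLOPS
```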

 

Intel's work on silicon photonics, including its recent announcement of a silicon laser, could help contribute toward the core-to-core connection challenge. Rattner and Prof. John Bowers of the University of California at Santa Barbara demonstrated Intel's newest breakthrough model of silicon laser, which was constructed using conventional techniques that are better suited to volume manufacturing than older iterations of the laser.

 

Many of the architectural nuances of the 80-core chip can be traced back to earlier research breakthroughs announced at previous IDFs. The technique of connecting chips directly to each other through tiny wires, called Through-Silicon Vias (TSV), was discussed by Intel in 2005. TSV will give the chip an aggregate memory bandwidth of 1 terabyte per second.

 

Intel, meanwhile, began to discuss replacing wires with optical technology in computers and chips in 2001 and has come out with several experimental parts for enabling lasers and optical technology to replace wires.

 

The same year, Intel began to warn about the dangers of heat dissipation in processors. One of the solutions, the company said at the time, lay in producing chips with multiple cores.

 

Wow...

This is going to F*ck up the gaming market for those who can't upgrade...

(I'm saving up already, just in case I need to buy a new PC...)

"Geez. It's like we lost some sort of bet and ended up saddled with a bunch of terrible new posters on this forum."

-Hurlshot

 

 


Excellent contribution to the discussion, Hades. :blink:

 

I doubt very much we'll see 80-core chips in stores in five years, though the 45nm die (that must be the physical limit of using visible EMR to print the circuits on the wafer) and the new info on laptops are interesting.

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


It's not like the number of cores matters to the end user. And once programmers/coders get a grip on multithreaded programming, the number of available cores will (almost) not matter to them either. Alan Wake (from the Finnish super-programmers at Remedy) will use five different threads, for example, no matter how many cores your CPU has.
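
For anyone curious what "a fixed number of threads regardless of core count" looks like in practice, here's a minimal hypothetical Python sketch (obviously not Remedy's actual engine code): a fixed set of worker threads pulls jobs from a shared queue, and the same code runs whether the CPU has two cores or eighty.

```python
import queue
import threading

NUM_WORKERS = 5          # fixed thread count, e.g. render/physics/audio/AI/streaming

jobs = queue.Queue()

def worker():
    while True:
        job = jobs.get()
        if job is None:  # sentinel value tells this worker to shut down
            break
        job()            # run the unit of work

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

# queue up some work, then shut everything down cleanly
for i in range(10):
    jobs.put(lambda i=i: print(f"finished job {i}"))
for _ in threads:
    jobs.put(None)
for t in threads:
    t.join()
```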

Swedes, go to: Spel2, for the latest game reviews in Swedish!


AI is set to surpass a single human's intellectual capacity within 30 years. And the total capacity of humanity in about 35.

 

The humans are in good hands, as the children of humanity will take good care of them. :D

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


AI is set to surpass a single human's intellectual capacity within 30 years. And the total capacity of humanity in about 35.

Off topic: IMO a human-designed intelligence simulator can never surpass the reasoning abilities of a human. You can write code to implement perfect logic, and you can probably write code to simulate intuitive human reasoning to such a degree as to provide the illusion of reasoning and self-awareness. But it's all just human-created code that tries to replicate the way we believe our intelligence works. If the machines decide to take over our world, they are not going to be able to out-reason humans. Yeah, they can probably compute battle tactics a hell of a lot quicker and with perfect precision, and that may be enough to wipe out our race. :blink:


If the machines decide to take over our world, they are not going to be able to out-reason humans. Yeah, they can probably compute battle tactics a hell of a lot quicker and with perfect precision, and that may be enough to wipe out our race. :blink:

I actually think with their perfect logic, they won't see the point of conquering the world, or for that matter, existing at all, unless you give them a stimulus to exist.


What about the Intelligence that is created by the A.I.? And so on ...

I feel it will be significantly stupider than the Intelligence that created it. At any level of existence, an Intelligence really has no clue about how it actually functions. This includes humans. The best it can do is try to replicate some of the ways in which it has observed itself responding to stimuli. Yes, it can probably create entities that are more efficient than itself at performing certain tasks (e.g., humans creating computers, an AI discovering bugs and creating new revised versions of itself), but there's no way it can create entities with totally new reasoning abilities. Unless... random mutation/natural selection is applied (and even then I'm not totally sure).

 

I've seen the power of random mutation + natural selection in an algorithms class project I did during undergrad: we had a virtual maze with food strewn about it, and we had to design ants that could hunt down the food in the least possible time. "Survival of the fittest" was used to weed out unsuccessful ants, and random mutation was used to build the state machine defining the behavior of the next generation of ants. After several hours of simulation, we ended up with some freakishly intelligent ants. It was all expected, of course, but seeing such logic arise out of a random process was slightly sobering.
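
The core loop of that kind of experiment is surprisingly small. Here's a toy sketch (hypothetical, evolving bit strings toward all-ones instead of maze ants) of mutation plus survival-of-the-fittest selection:

```python
# Toy genetic-algorithm loop: evolve bit-string "genomes" toward all 1s.
# Hypothetical sketch, not the ant/maze project described above.
import random

GENOME_LEN = 32
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 0.02

def fitness(genome):
    return sum(genome)                # count of 1-bits, a stand-in for "food found"

def mutate(genome):
    # flip each bit independently with probability MUTATION_RATE
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    # survival of the fittest: keep the better half of the population...
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    # ...then refill it with mutated copies of random survivors
    offspring = [mutate(random.choice(survivors)) for _ in range(POP_SIZE - len(survivors))]
    population = survivors + offspring

print("best fitness after", GENERATIONS, "generations:", max(map(fitness, population)))
```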


Errr, I'm not sure exactly what you are referring to here, as heuristics in computer science are used as part of algorithms. A* is a pathfinding algorithm that incorporates a heuristic. Without the heuristic, it ceases to be A* and becomes Dijkstra's algorithm.
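
To make the distinction concrete: A* orders its search by g(n) + h(n), and if you set h(n) = 0 it degenerates into Dijkstra's algorithm. A minimal grid example (hypothetical code, just to illustrate):

```python
# Minimal A* on a 4-connected grid; pass heuristic=lambda a, b: 0 to get Dijkstra.
import heapq

def a_star(grid, start, goal, heuristic):
    rows, cols = len(grid), len(grid[0])
    open_heap = [(heuristic(start, goal), 0, start)]   # entries are (f, g, node)
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g                                   # cost of the shortest path
        if g > best_g.get(node, float("inf")):
            continue                                   # stale heap entry, skip it
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + heuristic((nr, nc), goal), ng, (nr, nc)))
    return None                                        # goal unreachable

manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]                                     # 1 = wall
print(a_star(grid, (0, 0), (2, 0), manhattan))         # A*       -> 6
print(a_star(grid, (0, 0), (2, 0), lambda a, b: 0))    # Dijkstra -> 6
```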


What about the Intelligence that is created by the A.I.? And so on ...

I feel it will be significantly stupider than the Intelligence that created it. At any level of existence, an Intelligence really has no clue about how it actually functions. This includes humans. The best it can do is try to replicate some of the ways in which it has observed itself responding to stimuli. Yes, it can probably create entities that are more efficient than itself at performing certain tasks (e.g., humans creating computers, an AI discovering bugs and creating new revised versions of itself), but there's no way it can create entities with totally new reasoning abilities. Unless... random mutation/natural selection is applied (and even then I'm not totally sure).

 

I've seen the power of random mutation + natural selection in an algorithms class project I did during undergrad: we had a virtual maze with food strewn about it, and we had to design ants that could hunt down the food in the least possible time. "Survival of the fittest" was used to weed out unsuccessful ants, and random mutation was used to build the state machine defining the behavior of the next generation of ants. After several hours of simulation, we ended up with some freakishly intelligent ants. It was all expected, of course, but seeing such logic arise out of a random process was slightly sobering.

Did you just contradict yourself and agree with me? Or did I misunderstand your post?

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


Did you just contradict yourself and agree with me? Or did I misunderstand your post?

Yeah, I did do an about-face there. What I meant was: if you try to program your own reasoning abilities into an artificial intelligence, you won't get anywhere. If, however, you just replicate nature's survival-of-the-fittest laws and sit back and enjoy the fun, you could possibly end up with something interesting.


You might get into a lot of trouble if you condition it with Darwinian tendencies before hitting the "On" switch... It might decide that it is a superior being and needs to ensure its own survival against other threats. :blink:

“He who joyfully marches to music in rank and file has already earned my contempt. He has been given a large brain by mistake, since for him the spinal cord would surely suffice.” - Albert Einstein

