
Intel Kentsfield (Quad Core) Benchmarks


angshuman


It's not really a true "quad-core", but two dual-core Conroe chips glued together into a single package for a total of 4 cores. Of course, as an end-user you'll just see a single "chip" that you can plug into a 965 or 975X motherboard.

 

http://www.tomshardware.com/2006/09/10/fou...on_the_rampage/


Intel is really trying to keep AMD on their toes, but it does seem a little rushed. I guess they're trying to take the steam out of AMD's "4x4" system -- I can't wait to see how they perform against one another :cool:.

 

One problem I see with AMD's solution is whether the low-end versions of Vista will support it or not. With Windows XP, the "home edition" only supports one physical processor while the "pro edition" supports up to two. I figure Vista might do the same with its 5 or so different versions.


My big problem is not so much the competition at all ... it's more the lax coding for single-core 32-bit ... how much has actually been written for dual-core 32-bit, let alone 64-bit? Nothing ... ? The tech is good, but with no software to back it up we have a detente with no way to demonstrate quality ... does that make sense?

The universe is change;
your life is what our thoughts make it
- Marcus Aurelius (161)

:dragon:


Well, there's lots of software written to utilise multiple cores. The problem is, very little of it exists outside of the professional realm (graphic design, servers, etc..). I figure we'll start to see multithreaded PC games sometime soon, especially with the advent of both the XBox 360 and PS3 using multiple cores.

 

Thinking about it, much of the software existing today wouldn't benefit from being multithreaded. And even where it would, many applications don't really need the extra power. Notepad at 500 FPS :ermm:.


multi-threading in that light is good if you are a multi-tasker, but most gamers are concentrating on their games. as noted above, true threading barely exists outside of the professional realm (big time in defense, btw). what we need is threading within a process. we've talked about it before, and it is something we will see in our lifetimes, but not yet.

 

my guess is that a core-duo, or two core-duos tied together, will perform nearly identically to a single core of a core-duo when running a single app. there will be minor differences, since housekeeping tasks the OS regularly runs can be offloaded to another core, but that does not amount to much.
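For what it's worth, "threading within a process" is exactly what the standard threading APIs already expose; the hard part is structuring an app (especially a game) around them. A minimal Python sketch of the shape of it (the subsystem names are made up for illustration):

```python
import threading

def run_subsystem(name, results):
    # Stand-in for one independent chunk of per-frame work (AI, audio, physics).
    results[name] = sum(i * i for i in range(10_000))

results = {}
threads = [threading.Thread(target=run_subsystem, args=(n, results))
           for n in ("ai", "audio", "physics")]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # all three subsystems ran inside one process
```

Note that in CPython the GIL keeps pure-Python CPU work on one core at a time, so a real engine would use native threads or processes; the structure, though, is the same.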

 

taks

comrade taks... just because.


oh, btw, i should note that parallel processing in practice works out to something closer to a square-root increase in speed. i.e. 2 processors perform about as well as a sqrt(2)-times faster single processor, and four are about twice as fast. this is not a rule, and not exact, but something i've seen in my own work. the inefficiency is due to many things, but primarily (in no specific order): memory conflicts (they all share the same memory), I/O bandwidth (I/O is fast enough to feed one processor, but 4 quickly limit the amount of data each processor gets), and dependent operations. the latter is hit or miss, since some threads are easy to separate without any dependencies, but not all fall into this category.
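taks's observation lines up with Amdahl's law, which caps speedup by the serial fraction of the work. A quick sketch (the 80%-parallel figure is an illustrative assumption, not a measurement):

```python
def amdahl_speedup(parallel_fraction, n_procs):
    # Amdahl's law: the serial fraction limits overall speedup no matter
    # how many processors you add.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_procs)

# A workload that is 80% parallelizable gains far less than 4x on 4 cores:
for n in (1, 2, 4, 8):
    print(n, round(amdahl_speedup(0.8, n), 2))
```

Even with infinitely many processors, an 80%-parallel workload tops out at a 5x speedup, which is why memory, I/O, and dependent operations dominate long before core count does.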

 

taks

comrade taks... just because.


Sure, a single application (read: game) might not be the best use of multithreading, but there are definite advantages.

 

Firstly, it allows for added security/stability: run the critical kernel processes, like the Security Reference Monitor, on one core with its own cache (and RAM, if necessary).

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


Firstly, it allows for added security/stability: run the critical kernel processes, like the Security Reference Monitor, on one core with its own cache (and RAM, if necessary).

The universe is change;
your life is what our thoughts make it
- Marcus Aurelius (161)

:dragon:


Well, it has to be implemented properly.

 

Though Microsoft and Intel have been talking about secure hardware designs for over a decade now (nearly two, iirc), held back mainly by backward compatibility with x86 (iirc, again).

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


Isn't secure hardware, however, always going to be susceptible to software hacks/algorithms -- does that make sense?

I think I know what you're asking. Basically, you need bulletproof "secure software" running on top of the secure hardware in order to achieve a totally secure system. An ideal secure hardware design does two things: (a) provides abstractions so that secure software can be written to run on it, and (b) guarantees that, as long as the software running on top of it is perfectly secure, it is impossible for a hacker to exploit any hardware weakness (e.g., physically tapping into the processor-memory bus) to compromise the system. All layers of a secure system need to be perfect to make it watertight; any single loophole can compromise the entire system.


So angshuman, if MS were to try to enforce digital protection through hardware, someone could design software over it that could, hypothetically, allow for copying -- correct? I always wonder why there is such resistance to embracing the open-source paradigm and the revenue-stream shift it offers, only to then spend a lot of money protecting the antiquated idea of 'copyright.' It seems like a financial investment that will always drain resources ...

The universe is change;
your life is what our thoughts make it
- Marcus Aurelius (161)

:dragon:


Oh, they've thought this through :D. Your content will be encoded using a key such that it can be decoded only by favored software and hardware. Together, the hardware, software, content and display device form a system stack that is (ideally) completely impenetrable.

 

They already tried this to a certain extent with CSS for DVDs. CSS was designed such that if you wished to write software to decode a DVD, you had to register your company with the licensing authority and license a key from them. This is why a lot of open-source DVD decoders cannot play CSS-encrypted DVDs. Unfortunately (or fortunately, depending on your point of view), the system had several pitfalls, since only the OS, decoder and content formed the "secure" subsystem: (1) you could theoretically hack into the hardware running the secure OS/decoder, (2) you could redirect the final VGA output away from your display device into a capture device, and (3) the CSS key system itself was easily cracked, making the whole deal a joke.

 

This time, they have been much more careful. The MPAA has pulled Intel, Microsoft, and monitor manufacturers into the cartel. Your content, hardware, software and even your display device are now part of the secure subsystem. As long as you are using unencrypted content, you should have no issues. But if you use encrypted content, every component of your system involved in translating the content from its original form into something you can perceive needs to be "compliant" with the requirements of the content provider.
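The licensing scheme being described boils down to key wrapping: content is encrypted under a title key, and the title key ships wrapped under each licensed player's key. A toy Python sketch of the structure (XOR stands in for the real ciphers here; this is emphatically not how CSS or AACS actually encrypt anything):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy reversible "cipher" for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

title_key = secrets.token_bytes(16)
disc = xor_cipher(b"the movie bits", title_key)       # encrypted content

# Each licensed player holds a key; the title key is stored wrapped under it:
player_key = secrets.token_bytes(16)
wrapped_title_key = xor_cipher(title_key, player_key)

# A licensed player unwraps the title key, then decrypts the content:
recovered = xor_cipher(disc, xor_cipher(wrapped_title_key, player_key))
print(recovered)  # b'the movie bits'

# An unlicensed player has no player_key, so it cannot unwrap the title key --
# unless, as happened with CSS, the cipher itself gets cracked.
```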


I'm curious what revenue streams open-source software would provide versus antiquated copyright, and in particular whether it'd be able to make up the revenue lost by no longer being closed.

 

I think that is a very good and challenging question. There have been enough start-ups in the last few years (rather reminiscent of the .com bubble) revolving around open-source add-ons/tweaks/products that the sky is really the limit. It's just a question of whether there is strong enough dissatisfaction among the consumer base of closed systems ... not sure if we're there yet, but it'll be interesting to see how the open-source community does or does not respond to Vista's high-handed pressure on its supply chain ... or so my imagination likes to flirt ...

The universe is change;
your life is what our thoughts make it
- Marcus Aurelius (161)

:dragon:


It is very hard to make an open-standards system work when your customer and your adversary are one and the same. Once you start treating your customer as your adversarial intruder, the only way in which you can get them to use the content you provide according to your dictated terms and conditions is through a fortress-like closed system.

 

Open standards can work if these policies generate enough distaste among customers to convince them to start moving towards more independent content providers that are willing to work with open-standards systems. Without a drive from both the consumers as well as the content providers, it's probably not going to happen.


Just to clarify, what are you talking about with respect to Open Source?

 

The term gets tossed around quite a bit and applied to a variety of situations. Some use it to describe software that costs nothing, while others stick to Stallman's interpretation of "free software": not free as in zero cost, but free as in freedom, i.e. access to the source code.


Just to clarify, what are you talking about with respect to Open Source?

 

The term gets tossed around quite a bit and applied to a variety of situations. Some use it to describe software that costs nothing, while others stick to Stallman's interpretation of "free software": not free as in zero cost, but free as in freedom, i.e. access to the source code.

I was talking about open standards, not open source. Much as I love and use open-source software, I can understand when people claim it is not entirely feasible to build a profitable business around the model. Open standards are publicly available protocols and interfaces that anyone is free to create content or software for (e.g., TCP/IP, POP, the Windows API, the x86 ISA). I'm not sure I'm using the term accurately either, but all I want to say is that just as I can write an HTTP browser for myself to browse the internet, and just as I can write my own OS for my Intel machine, I should be able to write DVD decoder software for myself if I wish and distribute it to my friends.
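That's the whole point of an open standard like HTTP: the protocol is published text, so anyone can implement it from scratch. A minimal Python sketch (no licensed library or key required; example.com is just a placeholder host):

```python
import socket

def build_request(host: str, path: str = "/") -> bytes:
    # An HTTP/1.0 request is just structured text, straight from the spec.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii")

def http_get(host: str, path: str = "/", port: int = 80) -> bytes:
    # Connect, send the request, and read until the server closes the socket.
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(build_request(host, path))
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

# response = http_get("example.com")  # needs network access
```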


Ah yes, I like open standards too.

 

Stuff like OpenGL.

 

I don't know if DirectX is technically an open standard (I have no idea), but at least they dropped the licensing fee, so it is free to use.

 

The thing I hated most about 3dfx was their insistence on pushing support for Glide. As nVidia and ATI became more common, though, it was used less and less because developers weren't going to cut out huge parts of the installed base.

 

The fact that many of their cards (at least up to the Voodoo3) never natively supported OpenGL and had to use the classic MiniGL wrapper was also mind-boggling.


Isn't secure hardware, however, always going to be susceptible to software hacks/algorithms -- does that make sense?

Um, I think we're talking cross purposes: I'm talking about the kernel (the guts of the OS) running in a secure CPU, not an application running on top of the OS platform.

 

Sure, you could integrate an application into the OS (much like Microsoft has done with lots of stuff, like those gorram Outlook and Messenger things ...), but the more that gets stuck into the Executive layer, the easier it is to jam it.

 

Best to keep it simple. Not that M$ will.

 

Ah yes, I like open standards too.

 

Stuff like OpenGL.

 

I don't know if DirectX is technically an open standard (I have no idea), but at least they dropped the licensing fee, so it is free to use.

 

The thing I hated most about 3dfx was their insistence on pushing support for Glide. As nVidia and ATI became more common, though, it was used less and less because developers weren't going to cut out huge parts of the installed base.

 

The fact that many of their cards (at least up to the Voodoo3) never natively supported OpenGL and had to use the classic MiniGL wrapper was also mind-boggling.

DirectX is a proprietary Windows technology ... not sure if it is spreading further now. OpenGL works on multiple platforms, including *nix.

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT

