Everything posted by angshuman

  1. I feel it will be significantly stupider than the Intelligence that created it. At any level of existence, an Intelligence really has no clue about how it actually functions. This includes humans. The best it can do is try to replicate some of the ways in which it has observed itself respond to stimuli. Yes, it can probably create entities that are more efficient than itself at performing certain tasks (e.g., humans creating computers, AI discovering bugs and creating new revised versions of itself), but there's no way it can create entities with totally new reasoning abilities. Unless... random mutation/natural selection is applied (and even then I'm not totally sure). I've seen the power of random mutation + natural selection in an algorithms class project I did during undergrad: we had a virtual maze with food strewn about in it, and we had to design ants that could hunt for the food in the least possible time. "Survival of the fittest" was used to weed out unsuccessful ants, and random mutation was used to build the state machine defining the behavior of the next generation of ants. After several hours of simulation, we ended up with some freakishly intelligent ants (rough sketch of the loop below). It was all expected, of course, but to see such logic arise out of a random process was slightly sobering.
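     Roughly, the generation loop looked like the sketch below. This is a made-up toy reconstruction in Python (the 1-D world, the lookup-table "brain", and all names are invented for illustration; it is not our actual project code):

         import random

         GENOME_LEN = 16  # entries in the ant's behavior table (its "state machine")
         WORLD = [random.randint(0, 1) for _ in range(64)]  # 1 = cell with food

         def fitness(genome):
             """Walk the toy world, using the genome as a movement table."""
             pos, food, state = 0, 0, 0
             for _ in range(100):
                 pos = (pos + (1 if genome[state] else -1)) % len(WORLD)
                 food += WORLD[pos]
                 state = (state + pos) % GENOME_LEN  # crude state transition
             return food

         def mutate(genome, rate=0.1):
             """Random mutation: flip each gene with a small probability."""
             return [1 - g if random.random() < rate else g for g in genome]

         population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                       for _ in range(50)]
         for generation in range(200):
             population.sort(key=fitness, reverse=True)  # survival of the fittest
             survivors = population[:25]                 # weed out unsuccessful ants
             population = survivors + [mutate(random.choice(survivors))
                                       for _ in range(25)]
         print("best fitness:", fitness(population[0]))

     Since the best survivors are carried over unchanged each generation, the top fitness can only climb; the "intelligence" accumulates purely from selection pressure on random flips.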
  2. Dear GOD! No wonder you need to nitro-cool it... Edit: removed specs and comments, since there seems to be some speculation on the nature of the 128 "stream processors" on the G80.
  3. Off Topic: IMO a human-designed intelligence simulator can never surpass the reasoning abilities of a human. You can write code to implement perfect logic, and you can probably write code to simulate intuitive human reasoning to such a degree as to provide the illusion of reasoning and self-awareness. But it's all just human-created code that tries to replicate the way we believe our intelligence works. If the machines decide to take over our world, they are not going to be able to out-reason humans. Yeah, they can probably compute battle tactics a hell of a lot quicker and with perfect precision, and that may be enough to wipe out our race.
  4. Nah, with dual cores the numbering scheme has gone haywire. Intel's also given up on the gigahertz race and switched to their own numbering scheme, so the numbers just don't make any sense anymore. The "little Conroe", as mkreku said, is the E6300, and it costs about the same as an X2 4200 but performs better. You can't go wrong with this baby, but Conroes start showing their true scary potential only with 4MB of L2 cache, which starts with the E6600 (about $300).
  5. *sigh* My bad. Okay, I'll try again. Forget my previous post. This one's completely factual, no innuendos. As of today, at all price points, Intel's solutions are overwhelmingly superior to AMD's offerings. Intel's current platform will remain viable as an upgrade path for at least 1 year, and it is likely that processors will be available for the platform even 2 years from now, although there will undoubtedly be better platform offerings by this time. Intel's processors (in quad form, oct form, whatever) will very likely continue to outperform AMD counterparts for desktop applications until AMD releases their new K8L microarchitecture sometime in 2007. It is impossible to predict if K8L will be superior or inferior to Conroe.
  6. What exactly do you define as high-end? A $300 Intel E6600 that beats the socks off a $700 AMD FX-62? Or a $180 Intel E6300 that shames a $280 AMD X2 4800+? Sorry to use a confrontational tone, but from the specs you have just posted it looks like you are going for a full (mobo + CPU + RAM + graphics) upgrade. You've also mentioned that you do not care much about a long-lasting upgrade path. Given these facts, there's absolutely no reason to go AMD at this point in time except for brand loyalty (which I do not understand but do respect). In fact, since it has been demonstrated that a quad-core Kentsfield will usually drop right into a 965-based board, you have a pretty darn nice upgrade path with Intel too.
  7. Yup. Given that you're looking at a 2-year timeframe: If you're planning to upgrade your processor after 1 year, then go for AM2. If you want to buy a processor now and use it for 2 years, then an AM2 motherboard (+DDR2 memory) would be a worthless investment. Just buy the best 939 CPU you can afford.
  8. As a slashdot reader mentioned, the point is that not many people know otherwise. The average Joe doesn't even know there are alternatives. Also, don't forget that regardless of what Media Player you use, you'll be running it on Windows. How long before MS transfers these functionalities to the OS core? WMP could just be a testbed.
  9. Slashdot: Microsoft DRM To Get Even Tighter Inquirer: Microsoft Media Player shreds your rights
  10. I've always preferred Gnome over KDE, simply because it appears to be more consistent. Dunno if that even made sense. I just like it.
  11. Frankly, I don't think the pseudo-real-time systems in IE- or Aurora-based games are all that much better. Characters' behaviors in modern games with graphical frontends look extremely silly in turn-based combat with such long turn-switch granularities. An ideal combat system that both looks and feels accurate, and also accurately models the behavior of player-independent, stats-dependent combat, would have infinitesimally small turn-switch granularities (or time-slices, if you will). Of course, combat action lengths would need to be adjusted in terms of the number of time-slices needed to perform them (minimal sketch below).
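     A minimal sketch of what I mean, in Python; the tick length, class names, and numbers are all invented for illustration:

         TICK = 0.05  # seconds per time slice; "infinitesimally small" in spirit

         class Combatant:
             def __init__(self, name, attack_time):
                 self.name = name
                 # combat action length expressed as a number of time slices
                 self.attack_ticks = round(attack_time / TICK)
                 self.cooldown = 0

             def update(self):
                 """Called once per tick; the action fires only when it completes."""
                 if self.cooldown == 0:
                     print(self.name, "attacks")
                     self.cooldown = self.attack_ticks
                 else:
                     self.cooldown -= 1

         fighters = [Combatant("fighter", 1.5), Combatant("mage", 2.5)]
         for tick in range(100):  # ~5 seconds of simulated combat
             for f in fighters:
                 f.update()

     With slices this small, the turn-switching becomes invisible and combat looks continuous, while the resolution underneath stays strictly stats-driven.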
  12. CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON CANON
  13. I was talking about open standards, not open source. Much as I love and use open source software, I can understand when people claim it is not entirely feasible to build a profitable business around the model. Open standards are publicly available protocols that anyone is free to create content or software for (e.g., TCP/IP, POP, the Windows API, the x86 ISA, etc.). I'm not sure if I'm using this term accurately either, but all I want to say is that just as I can write an HTTP browser for myself to browse the internet, and just as I can write my own OS for my Intel machine, I should be able to write DVD decoder software for myself if I wish and distribute it to my friends.
  14. It is very hard to make an open-standards system work when your customer and your adversary are one and the same. Once you start treating your customer as your adversarial intruder, the only way in which you can get them to use the content you provide according to your dictated terms and conditions is through a fortress-like closed system. Open standards can work if these policies generate enough distaste among customers to convince them to start moving towards more independent content providers that are willing to work with open-standards systems. Without a drive from both the consumers as well as the content providers, it's probably not going to happen.
  15. Oh, they've thought this through :D. Your content will be encoded using a key such that it can be decoded only by favored software and hardware. Together, the hardware, software, content and display device form a system stack that is (ideally) completely impenetrable. They already tried this to a certain extent with CSS for DVDs. CSS was designed such that if you wished to write software to decode a DVD, you'd have to register your company with the regulatory authority and license the key from them. This is why a lot of open-source DVD decoders cannot play CSS-encrypted DVDs. Unfortunately (or fortunately, depending on your point of view), the system had several pitfalls, since only the OS, decoder and content formed the "secure" subsystem: (1) you could theoretically hack into the hardware that was running the secure OS/decoder, (2) you could redirect the final VGA output away from your display device into a capture device, and (3) the CSS key system itself was easily cracked, making the whole deal a joke. This time, they have been much more careful. The MPAA has pulled Intel, Microsoft, and monitor manufacturers into the cartel. Your content, hardware, software and even your display device are now part of the secure subsystem. As long as you are using unencrypted content, you should have no issues. But if you use encrypted content, all components of your system involved in translating the content from its original form to something that you can perceive need to be "compliant" with the requirements of the content provider (toy model below).
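     Here is a toy model of that "every link must be compliant" chain, in Python. It is purely illustrative; the real protocols are far more elaborate, and the device names and key set are invented:

         COMPLIANT = {"decoder", "os", "gpu", "monitor"}  # holders of licensed keys

         def play(chain):
             # Encrypted content decodes only if *every* stage on the path from
             # the disc to your eyes presents a licensed key; one outsider breaks it.
             if all(stage in COMPLIANT for stage in chain):
                 return "full-quality playback"
             return "blocked (or deliberately degraded) output"

         print(play(["decoder", "os", "gpu", "monitor"]))  # all compliant: plays
         print(play(["decoder", "os", "capture_device"]))  # pitfall (2) closed off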
  16. I don't know about the 940BF, but my lab just bought a bunch of 931BFs, and they are horrible. The construction quality is superb and it looks like a very slick product, but the panel itself is complete garbage. You can literally see the 6-bit to 8-bit dithering going on (illustrated below). Unfortunately, I don't expect any of these "2ms" or "1ms" or "0.001ns" panels to be any good. As far as human perception is concerned, I don't believe anything faster than 8ms is necessary even for the fastest FPSes or the most contrasty and dynamic movies. The old Dell 1901s in my lab also have 6-bit TN panels, but they do a better job of hiding the dithering than the new Samsungs. It could be that the Samsungs have a superior panel coating and this unfortunately reveals the ugly dithering going on in the background. I'll say what I always say and people never seem to listen or care: stay away from 6-bit TN panels; get an LCD with a true 8-bit panel.
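     For anyone wondering what "6-bit to 8-bit dithering" actually means, here's a tiny Python illustration (invented numbers, not Samsung's actual algorithm). A 6-bit panel can only show 64 levels per channel, so it flickers between the two nearest levels over successive frames (so-called FRC) to fake the missing 8-bit shades:

         def to_6bit(level):
             return level >> 2  # 256 levels -> 64: the panel's real resolution

         def frc_frames(level, n=8):
             """Temporal dithering: approximate an 8-bit level on a 6-bit panel."""
             low = to_6bit(level)
             frac = level & 0b11  # the remainder the panel cannot actually display
             # show the next level up on frac out of every 4 frames, 'low' otherwise
             return [low + (1 if (i % 4) < frac else 0) for i in range(n)]

         print(frc_frames(129))  # mostly level 32, with 33 flashed every 4th frame

     A better panel coating makes each individual frame crisper, which is exactly why the flicker between adjacent levels becomes easier to see.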
  17. I think I know what you're asking. Basically, you need to have bulletproof "secure software" running on top of the secure hardware in order to achieve a totally secure system. Ideal secure hardware does two things: (a) provides abstractions so that secure software can be written to run on it, and (b) guarantees that, as long as the software running on top of it is perfectly secure, it is impossible for a hacker to exploit any hardware weaknesses to compromise the system (e.g., by physically tapping into the processor-memory bus). All layers of a secure system need to be perfect in order to make it watertight; any single loophole could compromise the entire system (toy illustration below).
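     A toy version of the layering argument, loosely in the spirit of measured boot; everything here (layer names, the "expected" measurements) is invented for illustration:

         import hashlib

         def measure(layer_code):
             return hashlib.sha256(layer_code).hexdigest()

         # Known-good measurements of each layer, anchored in the secure hardware.
         EXPECTED = {
             "firmware": measure(b"trusted firmware v1"),
             "os":       measure(b"trusted os v1"),
             "app":      measure(b"trusted player v1"),
         }

         def boot(layers):
             # Each layer is measured before control passes to it. One bad
             # measurement (one loophole) and the whole stack is untrusted.
             for name, code in layers:
                 if measure(code) != EXPECTED[name]:
                     return "halt: " + name + " compromised"
             return "all layers attested"

         print(boot([("firmware", b"trusted firmware v1"),
                     ("os",       b"hacked os"),
                     ("app",      b"trusted player v1")]))  # halts at the os layer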
  18. Aah... my bad, I thought the base price included only the classic controller. So, it's $250 for the Wii vs. $350 (assuming drop) for the 360. Or, if you want 2 sets of controllers (a typical scenario), it's $310 for the Wii vs. $390 for the 360. I'd still pick the 360. <_<
  19. I'm a bit disappointed. $250 + $40 + $20 = $310. I don't think a new Gamecube revision is worth that price. Sounds like a forced upgrade for all Gamecube owners: Nintendo is now going to make all new games only for the new and shiny Gamecube, so you'll have to buy the new system if you want to play the new games. The system in itself has very little intrinsic value. Of course, things would have looked entirely different had the price point been $199 instead of $310. I think this is something the "it's all about the games, stupid" folks completely miss. The set of games for a new console should provide a significantly improved experience to justify the hardware investment. Looking at the Wii, I don't see any such thing. Nintendo might come up with some amazing new never-before-seen games, but they could probably have created those games for the Gamecube as well. It's a bit of a stretch, but I'll go out on a limb and claim that the same story also holds true for the controllers. I'm probably going to pick up a 360 when the next price cut hits. You can get a 20GB system with wireless controllers for $399, and I suspect this will drop to $350 or less within a few months. Much better value for money than the $310 Wii IMO, and the catalogue is looking better and better each day.
  20. But only one of them can bear the title of "Best Game EVAR"...
  21. It's not really a true "quad-core", but two dual-core Conroe chips glued together into a single package for a total of 4 cores. Of course, as an end-user you'll just see a single "chip" that you can plug into a 965 or 975X motherboard. http://www.tomshardware.com/2006/09/10/fou...on_the_rampage/
  22. If you're lucky, it'll hit 4.0.
  23. Now there's a man who has his priorities straight!